Wednesday, January 4th 2023

AMD Ryzen 7000X3D Announced, Claims Total Dominance over Intel "Raptor Lake," Upcoming i9-13900KS Deterred

AMD today announced its Ryzen 7000X3D "Zen 4" desktop processors with 3D Vertical Cache technology, and with them the company claims to have the world's fastest processors for gaming. AMD says it has beaten the Intel Core i9-13900K "Raptor Lake" in gaming by a margin it believes is wide enough to remain competitive even against the upcoming Core i9-13900KS. At the heart of these processors is the new "Zen 4" 3D Vertical Cache (3DV cache) CCD, which features 64 MB of L3 cache stacked on top of the region of the "Zen 4" CCD that holds the on-die 32 MB L3 cache. The 3DV cache runs at the same speed as the on-die L3 cache and is contiguous with it, so the CPU cores see 96 MB of transparently addressable L3 cache.
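
Since the stacked cache is transparent, it is also easy to observe from software. Here's a minimal sketch, assuming a Linux system with the standard sysfs cache interface (the paths are kernel-standard, but which index directory holds L3 can vary):

```python
# Minimal sketch: print the cache hierarchy the OS reports for core 0.
# Assumes Linux and the standard sysfs cache interface; on an X3D part
# the L3 entry should show the full stacked capacity (96 MB on the
# 7800X3D), since the 3DV cache is exposed as one unified L3.
from pathlib import Path

CACHE_DIR = Path("/sys/devices/system/cpu/cpu0/cache")

for index in sorted(CACHE_DIR.glob("index*")):
    level = (index / "level").read_text().strip()
    ctype = (index / "type").read_text().strip()
    size = (index / "size").read_text().strip()
    print(f"L{level} {ctype}: {size}")
```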

3DV cache proved to have a profound impact on gaming performance with the Ryzen 7 5800X3D "Zen 3" processor, helping it beat "Alder Lake" in gaming workloads despite "Zen 3" being a generationally older microarchitecture; and AMD claims to have repeated this magic with the 7000X3D "Zen 4" series, enabling it to beat Intel "Raptor Lake." Unlike with the 5800X3D, AMD doesn't intend to make gaming performance a trade-off against multi-threaded creator performance, so it is introducing 12-core and 16-core SKUs as well, giving you gaming performance alongside plenty of muscle for creator workloads.
The series consists of three SKUs: the 8-core/16-thread Ryzen 7 7800X3D, the 12-core/24-thread Ryzen 9 7900X3D, and the flagship 16-core/32-thread Ryzen 9 7950X3D. The 7800X3D comes with an as-yet unspecified base frequency above the 4.00 GHz mark, along with up to 5.00 GHz boost. The 7900X3D has a 4.40 GHz base frequency and boosts up to 5.60 GHz. The flagship 7950X3D ticks at 4.20 GHz base and boosts up to 5.70 GHz.

There's something interesting about the cache setup of the three SKUs. The 7800X3D has 104 MB of total cache (L2+L3), whereas the 7900X3D has 140 MB and the 7950X3D has 144 MB. The 8-core CCD in the 7800X3D has 64 MB of 3DV cache stacked on top of the 32 MB on-die L3 cache, resulting in 96 MB of L3 cache, and with each of the 8 cores having 1 MB of L2 cache, we arrive at 104 MB total cache. Logically, the 7900X3D and 7950X3D should have 204-208 MB of total cache, but they don't.

While we await more details from AMD on what's happening here, there are two theories—one holds that the 3DV cache for the 7900X3D and 7950X3D is just 32 MB per chiplet, or 64 MB L3 cache per CCD. 140 MB total cache for the 7900X3D would hence come from ((2 x 64 MB L3) + (12 x 1 MB L2)); and for the 7950X3D this would be ((2 x 64 MB L3) + (16 x 1 MB L2)).

The second, more radical theory holds that only one of the two CCDs has 64 MB of 3DV cache stacked on top of the on-die 32 MB L3 cache, while the other is a conventional "Zen 4" CCD with just 32 MB of on-die L3 cache. The math checks out either way. Dating all the way back to the Ryzen 3000 "Zen 2" "Matisse" dual-CCD processors, AMD has worked with Microsoft to optimize the Windows 10 and Windows 11 schedulers to localize gaming workloads to one of the two CCDs (using methods such as CPPC2 preferred-core flagging), so if these processors indeed have an asymmetric L3 cache setup between the two CCDs, the one with the 3DV cache would be preferred by the OS for gaming workloads.
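
The arithmetic behind both theories is simple enough to check against AMD's quoted totals; here's a quick sketch using only the figures quoted above:

```python
# Quick check of the two cache-layout theories against AMD's quoted totals.
L2_PER_CORE = 1  # each "Zen 4" core carries 1 MB of L2

def total_cache(l3_per_ccd_mb, cores):
    """Total cache in MB: per-CCD L3 amounts plus 1 MB of L2 per core."""
    return sum(l3_per_ccd_mb) + cores * L2_PER_CORE

print(total_cache([96], 8))        # 7800X3D: 104 MB, matches the spec sheet

# Theory 1: 32 MB of 3DV cache per CCD, i.e. 64 MB of L3 on each CCD.
print(total_cache([64, 64], 12))   # 7900X3D: 140 MB, matches
print(total_cache([64, 64], 16))   # 7950X3D: 144 MB, matches

# Theory 2: one CCD with the full 96 MB, the other a plain 32 MB CCD.
print(total_cache([96, 32], 12))   # 7900X3D: 140 MB, also matches
print(total_cache([96, 32], 16))   # 7950X3D: 144 MB, also matches

# Symmetric 96 MB on both CCDs would give 204/208 MB, which the spec
# sheets rule out.
print(total_cache([96, 96], 12), total_cache([96, 96], 16))  # 204 208
```

Note that both theories land on exactly the same totals, which is why the spec sheets alone can't settle which layout AMD chose.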

In its presentation, AMD uses the term "the world's best gaming processor" for the 7800X3D, not the 7950X3D. This suggests that despite its lower maximum boost frequency, the 7800X3D should offer the best gaming performance among the three SKUs, and very likely features 96 MB of L3 cache on its CCD; the 7900X3D and 7950X3D would then feature either less 3DV cache per CCD, or the asymmetric L3 cache setup theorized above.
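
As an aside, if the scheduler-level localization described earlier ever falls short, affinity can be forced by hand. The following is a hypothetical sketch using the cross-platform psutil package; the mapping of the 3DV-cache CCD to logical processors 0-15 (8 cores with SMT) is our assumption for illustration, not something AMD has confirmed:

```python
# Hypothetical sketch: pin a process to the CCD assumed to carry 3DV cache.
# ASSUMPTION: CCD0 = logical CPUs 0-15 (8 cores x 2 SMT threads); verify
# the real topology on actual hardware before relying on this.
import psutil

VCACHE_CCD_CPUS = list(range(16))  # assumed mapping, for illustration only

def pin_to_vcache_ccd(pid: int) -> None:
    """Restrict a process to the (assumed) V-cache CCD's logical CPUs."""
    proc = psutil.Process(pid)
    proc.cpu_affinity(VCACHE_CCD_CPUS)  # set the affinity mask
    print(f"{proc.name()} -> CPUs {proc.cpu_affinity()}")

# Example: pin the current process.
pin_to_vcache_ccd(psutil.Process().pid)
```
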
In terms of performance, AMD is claiming gaming performance gains of 21% to 30% for the 7800X3D over the previous-generation 5800X3D. These can be attributed to the IPC increase of the "Zen 4" core and faster DDR5 memory. AMD says the 7800X3D should particularly shine in CPU-limited gaming scenarios, such as lower-resolution, high refresh-rate setups.

The 7950X3D is claimed to beat the Core i9-13900K in gaming performance by 13% to 24% across the four tests AMD showed, while also offering big gains in multi-threaded productivity benchmarks. Especially in workloads involving large streaming data, such as file compression and DaVinci Resolve, the 7950X3D is shown holding 24% to 52% performance leads over the i9-13900K, gaps we doubt the i9-13900KS can close.

The Ryzen 7000X3D processors will be available from February 2023, and should be drop-in compatible with existing Socket AM5 motherboards, with some boards requiring a BIOS update. Since AMD has standardized the USB BIOS Flashback feature across motherboard brands, updating the BIOS ahead of a CPU upgrade shouldn't be a problem.

177 Comments on AMD Ryzen 7000X3D Announced, Claims Total Dominance over Intel "Raptor Lake," Upcoming i9-13900KS Deterred

#76
imeem1
I notice on AMD's site, the 7950X3D says "Native PCIe® Lanes (Total/Usable) 28/23",
which is less than the 7950X's 28/24.

How impactful is this?
#77
lightning70
I am sure the R9 7950X3D will be the most successful processor in games.
#78
The Von Matrices
imeem1: I notice on AMD's site, the 7950X3D says "Native PCIe® Lanes (Total/Usable) 28/23",
which is less than the 7950X's 28/24.

How impactful is this?
I'm pretty sure that's a typo. The unusable lanes are for the chipset and there is no such thing as a PCIe x5 interface.
#79
529th
Would love to see the 7950X3D with the 3D cache on the best CCD .. imagine 3D cache with 5.7GHz ST: I'd disable the non 3D cache CCD all day long, lol
#80
Kei
kapone32: Imagine a 5800X3D at 5 GHz on even a single core.
That's exactly what has me salivating lol. I had a 5800x until very recently and I thought it was going to be a bit longer before the 7k3D parts so I bought a 5800x3D because I do a lot of VR flight simulator stuff and other things it benefits.......now with this announcement I'm about to buy an AM5 setup and camp outside for a 7800x3D haha.
#81
Mussels
Freshwater Moderator
fevgatos: Very reliable result, the 12700K has better lows than the 13900K, lol.
Exactly.
Because the 13900K throttles a lot harder, while the 12700K doesn't run into its TDP and thermal issues as often.

A CPU that never reaches its throttle limits (like when an all-core OC is used) performs consistently, but a CPU that relies on very drastic differences between base clock and boost clock is going to have those lows any time a limit is reached, be it thermal or PL1/PL2 related.
#82
JustBenching
Mussels: Exactly.
Because the 13900K throttles a lot harder, while the 12700K doesn't run into its TDP and thermal issues as often.

A CPU that never reaches its throttle limits (like when an all-core OC is used) performs consistently, but a CPU that relies on very drastic differences between base clock and boost clock is going to have those lows any time a limit is reached, be it thermal or PL1/PL2 related.
You think the 13900K throttles in Far Cry 6? LOL. Man, what the actual heck. This is just trolling, I don't believe for a second that you believe what you just said. It barely draws 100 watts in Far Cry 6, but somehow it's throttling! There is not a single game in existence that makes the 13900K throttle due to either power or thermal limitations.
#83
Crylune
I'm good, my 5800X3D is fine and will carry my 4090 for the next few years just fine, especially since I use DLDSR 2.25x to get my games rendered at 4K.

Waiting for Zen 5 3D V-Cache.
#84
Kei
Crylune: I'm good, my 5800X3D is fine and will carry my 4090 for the next few years just fine, especially since I use DLDSR 2.25x to get my games rendered at 4K.

Waiting for Zen 5 3D V-Cache.
I am anxiously awaiting reviews for them before I truly decide, but if I don't love what I see then I'll be doing the same as you (same system specs) and be more than happy.

I hope there is a decent little bump though but we'll see...
#85
Crylune
Kei: I am anxiously awaiting reviews for them before I truly decide, but if I don't love what I see then I'll be doing the same as you (same system specs) and be more than happy.

I hope there is a decent little bump though but we'll see...
I don't have a real reason to swap out my 5800X3D considering it's better than a 12900KS and I don't gain much since my games are 4K rendered, as I mentioned. Based on TechPowerUp's reviews of 53 games with the 4090 paired with a 5800X3D vs 13900K, the 13900K is on average 4.7% faster... hurray. And the platform cost of AM5 and these new X3D CPUs is not something I want to dabble in right now.

It will do just fine until Zen 5 3D V-Cache which is when I'll be doing a complete system rebuild, and get a 5090 or something, granted if there is stock. If not, I'll keep my 4090.
#86
JustBenching
Crylune: I don't have a real reason to swap out my 5800X3D considering it's better than a 12900KS and I don't gain much since my games are 4K rendered, as I mentioned. Based on TechPowerUp's reviews of 53 games with the 4090 paired with a 5800X3D vs 13900K, the 13900K is on average 4.7% faster... hurray. And the platform cost of AM5 and these new X3D CPUs is not something I want to dabble in right now.

It will do just fine until Zen 5 3D V-Cache which is when I'll be doing a complete system rebuild, and get a 5090 or something, granted if there is stock. If not, I'll keep my 4090.
The 3D isn't better than the 12900K or the KS. It just isn't.
#87
Crylune
fevgatos: The 3D isn't better than the 12900K or the KS. It just isn't.
That's why it tops it in most games and especially the games I play, while costing much less, drawing MUCH less power, and being able to do so with much crappier memory, right?

Of course it's not better in productivity, but this isn't a productivity rig. It's a gaming rig. I swapped the 5900X for this, and the little productivity I do, it handles more than fine. Now, as for games, the 12900K/KS would never have given me the same stutter-free experience in VR games and the fantastic 0.1% and 1% lows, which is what I'm after. Sorry, but 'it just won't'.

Edit: ah, you own a 12900K, def some buyer bias going on.
#88
Bagerklestyne
Ok, I can't brain this one

7900X3D - what does the cache and core layout look like?

It's only one CCD with the extra 64 meg of cache, right?

So is it an 8-core CCD with an extra 64 meg, in which case the other CCD is 4-core?
Or is it 2x 6-core CCDs (one of them with the extra 64 meg), in which case the cores get more cache and there's now no technical reason we can't have a 7600X3D?
#89
JustBenching
Crylune: That's why it tops it in most games and especially the games I play, while costing much less, drawing MUCH less power, and being able to do so with much crappier memory, right?

Of course it's not better in productivity, but this isn't a productivity rig. It's a gaming rig. I swapped the 5900X for this, and the little productivity I do, it handles more than fine. Now, as for games, the 12900K/KS would never have given me the same stutter-free experience in VR games and the fantastic 0.1% and 1% lows, which is what I'm after. Sorry, but 'it just won't'.

Edit: ah, you own a 12900K, def some buyer bias going on.
No, it's not. And... using your logic: ah, you own a 3D, def some buyer bias going on. Right? Or does it only apply to other people? :p

I don't own a 12900K anymore, got a 13900K.

Performance from this very site: the 3D is far down the list, nowhere near close to the 12900K.

tpucdn.com/review/intel-core-i9-13900k/images/relative-performance-games-1280-720.png
#91
JustBenching
wNotyarD: Do you play at 720p?
Does it matter? He said the 3D is faster than the 12900KS, which is not the case. Does he play at 720p?
#92
wNotyarD
fevgatos: Does it matter? He said the 3D is faster than the 12900KS, which is not the case. Does he play at 720p?
He didn't say it's faster, he said it's better, justifying it with the better lows than either the 12900K or 13900K, thus making his gameplay more fluid.
#93
JustBenching
wNotyarD: He didn't say it's faster, he said it's better, justifying it with the better lows than either the 12900K or 13900K, thus making his gameplay more fluid.
But it's not better, at lows or averages or what have you. It's far behind, as per the review from this very site I just linked.
#94
kapone32
fevgatos: Does it matter? He said the 3D is faster than the 12900KS, which is not the case. Does he play at 720p?
I think your 5800X3D is not working properly.
#95
JustBenching
kapone32: I think your 5800X3D is not working properly.
Does yours? Go ahead then, upload a cyberpunk 2077 run, spiderman, spiderman miles morales, farcry 6, valorant, cs go, choose your poison.
#96
qubit
Overclocked quantum bit
If the 7800X3D is as good as this article suggests, then Intel have a big problem.

I'm gonna seriously consider this for my 2700K upgrade once the reviews are out. Will be really nice to dodge the e-core bullet, if nothing else.
#97
wheresmycar
qubit: If the 7800X3D is as good as this article suggests, then Intel have a big problem.

I'm gonna seriously consider this for my 2700K upgrade once the reviews are out. Will be really nice to dodge the e-core bullet, if nothing else.
Mate, I've done 4 upgrades since the 2700K (> 4790K > 7700K > 9700K)... I'm also waiting to see how the 7800X3D plays out, although I'm not feeling the higher premiums for the platform/DDR5 swap. Got a couple of freely handed AM4s sitting around, with the 5800X3D being the upgrade backup.

Just curious, is the 2700K your daily game driver? Mine couldn't handle some of the newer titles around 10 years ago (maybe exaggerated, or closer to home).
#98
Eskimonster
Space Lynx: they just announced live the best supercomputer chip ever made in history. 146 billion transistors, and most of the show was about innovation in healthcare, robotic surgery; Lisa Su also brought on a female astronaut to talk about how AMD has helped Artemis to the Moon, and other NASA relationships, etc...

Did you watch the Live Show at all, or just go based off tech threads?

very small part of it was for gaming. AMD really did a great job tonight, Lisa Su was fantastic.
No, her green jacket was fantastic /s
#99
qubit
Overclocked quantum bit
wheresmycar: Mate, I've done 4 upgrades since the 2700K (> 4790K > 7700K > 9700K)... I'm also waiting to see how the 7800X3D plays out, although I'm not feeling the higher premiums for the platform/DDR5 swap. Got a couple of freely handed AM4s sitting around, with the 5800X3D being the upgrade backup.

Just curious, is the 2700K your daily game driver? Mine couldn't handle some of the newer titles around 10 years ago (maybe exaggerated, or closer to home).
4 upgrades? It has been a while for me! :laugh:

Yes, the 2700K is my main PC; I don't have anything higher spec than this - see specs for full info. Note that I've upgraded just about everything other than the CPU, supporting components and case since I built it in 2011.

Well, on the desktop, it feels as snappy as ever. Seriously, no slowdown at all since I first built it, so at least Microsoft hasn't made Windows any slower. Fantastic. I don't run any intensive apps that would really show up the lack of performance compared to a modern system.

Now, while I do have hundreds of games, I haven't played that many of them (Steam is very efficient at separating me from my money with special offers lol) or that often.

I ran Cyberpunk 2077 and got something like 15-25 fps even when dropping screen res and details right down, so it's no good for that. In hindsight, I should have gotten my money back, nvm.

CoD: Modern Warfare (the newer one) runs quite well at 60-110fps or so. Jumps around a lot, but with my newish G-SYNC monitor, that hardly matters and it plays fine. Even before that, the experience was still good, but not great, especially if I set the screen refresh rate to 144Hz and vsync off. Felt very responsive like that. Note that my 2080 Super easily plays this game at 4K. I don't have all the details maxed out though, regardless of resolution. I don't like motion blur and ambient occlusion doesn't make that much visual difference, so I turn them both off, for example, but both really reduce the performance.

CoD: Modern Warfare II Warzone 2.0 runs with rather less performance and can drop down into the stuttery 40 fps range, which is below what the G-SYNC compatibility will handle, but otherwise it's not too bad. It also tends to hitch a bit, but my console friends reported that too, so it's a game engine problem, not my CPU.

I've got CoD games running back generations and they all work fine. Only the latest one struggles to any degree.

I've run various other games which worked alright too, can't remember the details now. It's always possible to pick that one game that has a really big system load, or is badly optimised and runs poorly, but that can happen even with a modern CPU.

I have a feeling that this old rig, with its current spec, can actually game better than a modern gaming rig with a low end CPU. Haven't done tests of course, but it wouldn't surprise me.

Agreed, I don't like the greater expense for AMD either, so the devil will be in the details. I want to see what the real-world performance uplift will be compared to the 13700K I have my eye on before I consider my upgrade. Thing is, every time I think I'm finally gonna pull the trigger, the goal posts move! The real deadline here, of course, is that Windows 10 patch support is gonna end in 2025, so it's gonna happen for sure by then.

And finally, out of interest, here's the thread I started when I upgraded to my trusty 2700K all those years ago. It's proven to be a superb investment to last this long and still be going strong.

www.techpowerup.com/forums/threads/qubits-sandy-upgrade.155874/