Thursday, May 9th 2019

AMD Ryzen 9 3000 is a 16-core Socket AM4 Beast

AMD is putting the finishing touches on its 3rd generation Ryzen socket AM4 processor family, which is slated for a Computex 2019 unveiling, followed by possible E3 market availability. Based on the "Matisse" multi-chip module that combines up to two 8-core "Zen 2" chiplets with a 14 nm I/O controller die, these processors see a 50-100 percent increase in core counts over the current generation. The Ryzen 5 series now includes 8-core/16-thread parts, the Ryzen 7 series chips are 12-core/24-thread, while the newly created Ryzen 9 series (designed to rival Intel's Core i9 LGA115x parts) will include 16-core/32-thread chips.

Thai PC enthusiast TUM_APISAK confirmed the existence of the Ryzen 9 series, having landed an engineering sample of the 16-core/32-thread chip that ticks at 3.30 GHz with a 4.30 GHz Precision Boost frequency. The infamous AdoredTV leaks that drew the skeleton of AMD's 3rd generation Ryzen roadmap referenced two desktop Ryzen 9 parts, the Ryzen 9 3800X and Ryzen 9 3850X. The 3800X is supposed to be clocked at 3.90 GHz with a 4.70 GHz boost and a TDP rating of 125 W, while the 3850X tops the charts at 4.30 GHz base and a staggering 5.10 GHz boost, with the rated TDP shooting up to 135 W. We can now imagine why some motherboard vendors are selective with BIOS updates on some of their lower-end boards: AMD is probably maximizing the clock-speed headroom of these chips out of the box to preempt Intel's "Comet Lake" 10-core/20-thread processor.
Sources: TUM_Apisak, Tom's Hardware

197 Comments on AMD Ryzen 9 3000 is a 16-core Socket AM4 Beast

#176
bug
efikkanIt's a common misunderstanding that poor multicore scaling is primarily a lack of good software. I explained this some more here.
TL;DR: most real-world tasks can't scale across an arbitrary number of cores, so unless you're running more tasks or typical server workloads, more and more cores will only give you diminishing returns, and even lower performance if at some point you have to sacrifice per-core performance for more cores.

Single-core performance is essential and will only become more important in the coming years, even for processes which use many threads, due to synchronization overhead. But the clock-speed race seems to be nearly over, so future gains will come from IPC increases.
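That diminishing-returns ceiling is essentially Amdahl's law. A quick sketch (the parallel fractions below are made-up illustrative values, not measurements of any real workload):

```python
# Amdahl's law: the speedup on n cores is capped by the serial
# fraction of the work, no matter how many cores you add.

def amdahl_speedup(p: float, n: int) -> float:
    """Theoretical speedup with parallel fraction p on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.50, 0.90, 0.95):
    for n in (4, 8, 16):
        print(f"p={p:.2f}, {n:2d} cores -> {amdahl_speedup(p, n):.2f}x")
```

Even with 95% of the work parallelizable, 16 cores top out at roughly a 9x speedup, and a half-serial workload never gets past 2x.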
Unfortunately, understanding all that requires programming knowledge. Most people don't have that.
At some point I crossed paths with a guy with several years of programming experience, rather well regarded within his team. When tasked with something that required a mild amount of concurrency, he said "I'm going to need some time to get familiar with this threading thing". If a guy who programs for a living can say that, good luck explaining cores and threads to the layman.
Posted on Reply
#177
efikkan
bugUnfortunately, understanding all that requires programming knowledge. Most people don't have that.

At some point I crossed paths with a guy with several years of programming experience, rather well regarded within his team. When tasked with something that required a mild amount of concurrency, he said "I'm going to need some time to get familiar with this threading thing". If a guy who programs for a living can say that, good luck explaining cores and threads to the layman.
Sure. My main point is that it's not just a lack of willingness to adopt multithreading; hard problems are actually hard to solve.

Your story doesn't surprise me at all. Of all the programmers I've dealt with over more than a decade and a half, probably less than 5% are at the level of competence needed to deal with problems this complex. Even a typical programmer with 10 years of experience wouldn't fully grasp the problem, even if it were explained in detail. The vast majority of programmers, like web developers, app developers and most of those writing enterprise software in Java or C#, don't touch anything this low level, so they never develop an understanding of how it works.
Posted on Reply
#178
bug
efikkanSure. My main point is that it's not just a lack of willingness to adopt multithreading; hard problems are actually hard to solve.

Your story doesn't surprise me at all. Of all the programmers I've dealt with over more than a decade and a half, probably less than 5% are at the level of competence needed to deal with problems this complex. Even a typical programmer with 10 years of experience wouldn't fully grasp the problem, even if it were explained in detail. The vast majority of programmers, like web developers, app developers and most of those writing enterprise software in Java or C#, don't touch anything this low level, so they never develop an understanding of how it works.
Well, if you think about it, higher-level languages actually trivialize multi-threading (think Java's executors, Erlang's spawn or Go's goroutines). The fact that even with this help programs don't fully load all cores is further proof that the things we routinely do don't really need that many cores.
On the other hand, if we programmers would bubble sort and brute force everything, the whiny bunch would actually be much happier. Their cores would suddenly be seeing 100% usage.
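As a rough Python analogue of those executor APIs (the workload here is a made-up stand-in), submitting work to a pool really is this short:

```python
# Thread-pool executors hide thread creation, scheduling and joining:
# you submit callables to the pool and collect the results.
from concurrent.futures import ThreadPoolExecutor

def work(x: int) -> int:
    # stand-in for a real, independent task
    return x * x

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(work, range(8)))  # map preserves input order

print(results)
```

The ceremony is gone, but as the thread notes, none of this guarantees the work actually parallelizes well.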
Posted on Reply
#179
efikkan
bugWell, if you think about it, higher level languages actually trivialize multi-threading (think Java's executors, Erlang's spawn or Go's goroutines). The fact that even with this help programs don't fully load cores is further proof things we routinely do don't really need that many cores.
This only really works if you're writing something that spawns independent worker threads, and each of them does a fairly large chunk of work (so the average overhead becomes small). This mostly applies to typical server workloads, and is hard to apply efficiently in normal desktop applications.
Some languages have ways to distribute functions across several worker threads, but doing so usually creates more synchronization overhead and problems than it solves.
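The "fairly large chunk of work" point can be sketched in Python (chunk count and workload are illustrative; note that in CPython the GIL limits thread speedups to I/O-bound work, so this only shows the pattern):

```python
# Chunking amortizes per-task scheduling and synchronization cost:
# each worker gets one big slice of the data rather than many tiny tasks.
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk: list) -> int:
    # stand-in workload: sum of squares over one chunk
    return sum(x * x for x in chunk)

def parallel_sum_squares(data: list, workers: int = 4) -> int:
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(process_chunk, chunks))

print(parallel_sum_squares(list(range(1000))))
```

Submitting the thousand items as individual tasks instead would multiply the scheduling overhead for the same result.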
Posted on Reply
#180
EarthDog
Details... :p

Still waiting for software to manage...whatever that means on the programming side. :)
Posted on Reply
#181
bug
efikkanThis only really works if you're writing something that spawns independent worker threads, and each of them does a fairly large chunk of work (so the average overhead becomes small). This mostly applies to typical server workloads, and is hard to apply efficiently in normal desktop applications.
Some languages have ways to distribute functions across several worker threads, but doing so usually creates more synchronization overhead and problems than it solves.
Even so, you can have an architect or tech lead come up with the blueprints. My point was that, from a programming point of view, threading has become pretty trivial. Using threading to speed things up, like you point out, is a whole other story: not everything becomes faster because you spread the load over more threads. You can split off a game's AI and let it run amok, but unless that game happens to be chess, it still has to sync up with user input and whatnot.
Now let's try to get back on topic ;)
Posted on Reply
#182
NdMk2o1o
So octo cores are OK now that they're mainstream, but any more and it's a no-no. Gotcha, let us know when it's OK by you for us to use more than 8 cores
Posted on Reply
#183
bug
NdMk2o1oSo octo cores are OK now that they're mainstream, but any more and it's a no-no. Gotcha, let us know when it's OK by you for us to use more than 8 cores
Buying CPUs by the core while disregarding your actual needs is a no-no, but that's seemingly beyond your comprehension.
Where the hell did you pull that 8-core limitation from anyway?
Posted on Reply
#184
NdMk2o1o
bugBuying CPUs by the core while disregarding your actual needs is a no-no, but that's seemingly beyond your comprehension.
Where the hell did you pull that 8-core limitation from anyway?
You could say the same about any high-end computer component purchase, when you can buy something for a fraction of the price that will still do the same job, or the guys with 32/64 GB of RAM when 8/16 GB is sufficient for 90% of people. That's the key: 16-core CPUs won't be bought by the 90%. I won't buy one; I have 6c/12t and that's plenty for me, yet I don't try to tell other people what they should buy based on my use. I don't see what the big deal is. If you don't see a need for it, you won't buy it, simple. Of course there are those with more money than sense who will buy it just because, but that's no different to how it's ever been, regardless of whether it's 8/16/32 cores or whatever
EarthDogThe real winners here, and I've said this before, are the cheap quad/hex/octo w/SMT. Much more than that, few need to care.
And sorry, I'm still catching up with the thread; I was replying to this comment from the previous page, my bad
Posted on Reply
#185
EarthDog
NdMk2o1oSo octo cores are OK now that they're mainstream, but any more and it's a no-no. Gotcha, let us know when it's OK by you for us to use more than 8 cores
Jesus, this guy can't see the forest for the trees. :(

Edit: or just not finishing the thread before replying... as I just did seeing that comment. Haha!

As I've said before, what this does is set up the lemmings and those not in the know (95% of people) to think more cores are better. And to an extent, that is true. But for most users a 6c/12t CPU is plenty and doesn't bottleneck anything (and won't for years)... so yes, I'm annoyed the mainstream is packing in cores. At least with clock speeds, EVERYONE benefits. Cores... few do.
Posted on Reply
#186
Aquinus
Resident Wat-man
Honestly, if this pans out to be true, I might consider going with this 16c chip. I'm already thinking about upgrading, since X79 at this point is a pretty dated platform; I wanted to replace it with a 2950X, but if I can get 16c/32t for 500 USD instead of 800, I'm all for it (never mind the cost of a TR motherboard). In reality I don't need all of that PCIe goodness, and if dual-channel DDR4 can keep at least 12c/24t fully fed under memory-intensive tasks, then I think I've found my next upgrade.

I've waited this long, so let's just wait and see what happens. :D
Posted on Reply
#187
TheLostSwede
News Editor
AquinusHonestly, if this pans out to be true, I might consider going with this 16c chip. I'm already thinking about upgrading, since X79 at this point is a pretty dated platform; I wanted to replace it with a 2950X, but if I can get 16c/32t for 500 USD instead of 800, I'm all for it (never mind the cost of a TR motherboard). In reality I don't need all of that PCIe goodness, and if dual-channel DDR4 can keep at least 12c/24t fully fed under memory-intensive tasks, then I think I've found my next upgrade.

I've waited this long, so let's just wait and see what happens. :D
Expect X570 boards to be pricey. This might be a reason why AMD is keeping CPU prices low. Don't expect the exact pricing from AdoredTV's leak though, as prices have changed since then. Two weeks to go...
Posted on Reply
#188
Aquinus
Resident Wat-man
TheLostSwedeExpect X570 boards to be pricey. This might be a reason why AMD is keeping CPU prices low. Don't expect the exact pricing from AdoredTV's leak though, as prices have changed since then. Two weeks to go...
Expensive, sure, but TR expensive? Probably not. We'll know soon enough. :D
Posted on Reply
#189
Leaked
TheLostSwedeExpect X570 boards to be pricey. This might be a reason why AMD is keeping CPU prices low. Don't expect the exact pricing from AdoredTV's leak though, as prices have changed since then. Two weeks to go...
Will the 16-core parts be released later or with the other CPUs?
Posted on Reply
#190
lsevald
What is driving the price of X570 boards up so much? I guess we can use previous-generation boards if it's that bad?
Posted on Reply
#191
GoldenX
lsevaldWhat is driving the price of X570 boards up so much? I guess we can use previous-generation boards if it's that bad?
Wait for the BIOS updates.
Posted on Reply
#192
HTC
lsevaldWhat is driving the price of X570 boards up so much? I guess we can use previous-generation boards if it's that bad?
Possibly a lack of volume with the X570 chipset? I've read that the reason Zen 2 was delayed was that it had to wait for the chipset, because the CPUs themselves had been ready for a while, but I'm not 100% sure if this is true or not.

OTOH, it could very well be board makers attempting to squeeze us consumers ...
Posted on Reply
#193
TheLostSwede
News Editor
AquinusExpensive, sure, but TR expensive? Probably not. We'll know soon enough. :D
Probably not, but expect a $30-50 premium over current boards, if the board makers don't shoulder any of the cost.
LeakedWill the 16-core parts be released later or with the other CPUs?
Unclear. AMD hasn't told the board makers the launch lineup as yet. But it's correct that there are 12 and 16 core parts with the board makers, as per various rumours.
lsevaldWhat is driving the price of X570 boards up so much? I guess we can use previous-generation boards if it's that bad?
The extra parts needed for PCIe 4.0. A full PCIe 4.0 board needs two sets of them, one set at minimum, and these are in addition to the cost of the chipset itself. It seems there are no issues with using current boards though, you just don't get the new board-related features.
HTCPossibly the lack of volume with the X570 chipsets? I've read the reason Zen 2 had been delayed was for it to wait for the chipset because the CPUs themselves were ready for a while, but i'm not 100% sure if this is true or not.

OTOH, it could very well be board makers attempting to squeeze us consumers ...
No squeezing, it's all about PCIe 4.0. Extra re-drivers and re-timers are needed, and these are costly for PCIe 4.0 right now, since very few products need them. That will likely change over time.
Posted on Reply
#194
juiseman
I don't see 16-core CPUs as a bad thing; parallel processing is the only way to go from here until someone figures out how to overcome the current limitations. It seems they've pushed the envelope about as far as they can already, so IPC and more cores are the only way for the chip makers to sell and market something new.
Things get hairy after 5 GHz-ish; efficiency drops off remarkably after that, as test results repeatedly show.
Just remember: we would all still be on 4 cores if AMD hadn't forced 6 and 8-core CPUs into the mainstream. AMD doing well is good for everybody, AMD and Intel fans alike.
I'm all Intel now myself, but if AMD puts out a good CPU at a good price, I could make use of all those extra cores. There are a large number of people who would say the same.
Posted on Reply
#195
xorbe
I need this budget-priced 16/32 CPU for some hobbyist Linux box fun!
Posted on Reply
#196
Aquinus
Resident Wat-man
TheLostSwedeProbably not, but expect a $30-50 premium over current boards, if the board makers don't shoulder any of the cost.
To get 16c/32t for 300 USD less than the 2950X? I could live with that.
Posted on Reply
#197
storm-chaser
juisemanJust remember: we would all still be on 4 cores if AMD hadn't forced 6 and 8-core CPUs into the mainstream. AMD doing well is good for everybody, AMD and Intel fans alike.
I'm all Intel now myself, but if AMD puts out a good CPU at a good price, I could make use of all those extra cores.
There are a large number of people who would say the same.
While AMD did achieve something great with the release of their multi-core CPUs, it was Intel who first released a quad-core CPU to the market.
And while I agree that AMD capitalized on the market by producing 6 and 8-core CPUs, I don't think we'd still be stuck on four cores in the mainstream. Could be wrong though...
Posted on Reply