Thursday, May 9th 2019

AMD Ryzen 9 3000 is a 16-core Socket AM4 Beast

AMD is putting the finishing touches on its 3rd generation Ryzen socket AM4 processor family, which is slated for a Computex 2019 unveiling, followed by a possible E3 market availability. Based on the "Matisse" multi-chip module that combines up to two 8-core "Zen 2" chiplets with a 14 nm I/O controller die, these processors see a 50-100 percent increase in core counts over the current generation. The Ryzen 5 series now includes 8-core/16-thread parts, the Ryzen 7 series chips are 12-core/24-thread, while the newly created Ryzen 9 series (designed to rival Intel's Core i9 on LGA115x) will include 16-core/32-thread chips.

Thai PC enthusiast TUM_APISAK confirmed the existence of the Ryzen 9 series, having landed an engineering sample of the 16-core/32-thread chip that ticks at 3.30 GHz with a 4.30 GHz Precision Boost frequency. The infamous AdoredTV leaks that drew the skeleton of AMD's 3rd generation Ryzen roadmap referenced two desktop Ryzen 9 parts, the Ryzen 9 3800X and Ryzen 9 3850X. The 3800X is supposed to be clocked at 3.90 GHz with 4.70 GHz boost and a TDP rating of 125 W, while the 3850X tops the charts at 4.30 GHz base and a staggering 5.10 GHz boost, with the rated TDP shooting up to 135 W. We can now imagine why some motherboard vendors are being selective with BIOS updates on some of their lower-end boards. AMD is probably maximizing the clock-speed headroom of these chips out of the box to preempt Intel's "Comet Lake" 10-core/20-thread processor.
Sources: TUM_Apisak, Tom's Hardware

197 Comments on AMD Ryzen 9 3000 is a 16-core Socket AM4 Beast

#151
efikkan
TheLostSwede: Let me be like you two, show me proof. Can you verify any of the stuff you just wrote? Links? Sources?
You know how evidence works, right?
If you claim that Zen 2 was to release in February/March, something that runs contrary to the established knowledge/baseline, then you're the one who has to prove something.

Proving a negative is hard, if not impossible. But as I've pointed out, there was a lack of the telltale signs of an impending release; those signs have only started popping up since then, as even you have acknowledged, and the evidence points to a release in the "middle" of the year, just as Lisa said during CES. So I don't have to prove anything.
TheLostSwede: You two are speculating a lot, which you're free to do, but at the same time, you then need to accept that others have the same right.
As I've said several times in this thread already, there is nothing wrong in speculating.
But there isn't really any speculation in the post you quoted.

Why do you keep pretending that I say things that I don't? Calm down. If your case is strong, you should be able to make your case without insulting people.
TheLostSwede: I know for a fact that certain things have happened, I don't know why and I don't have all the details…
There are probably hundreds of people or more who have some degree of access to NDA'ed information or engineering samples at this point, and I don't doubt that, but that doesn't mean you have access to every detail. And there is certain information that is not available until close to release, even to AMD's own engineering teams, simply because the details are not ready yet.
TheLostSwede: …but I hope you two are at least big enough as people that you can admit that you have called people names for the wrong reasons once the product launches.
I have stated that people can't know a fact before it exists, and that those who claim to are lying; that is not name calling, that's pretty much what the definitions of facts and speculation come down to. And this has nothing to do with AMD; it is the same for all the makers, and has nothing to do with how good Zen 2 turns out to be. Speculation is speculation no matter how accurate it turns out to be.
Posted on Reply
#152
TheLostSwede
News Editor
Shatun_Bear: January or February, the CPUs were not even close to releasing, so it makes little difference.

I mentioned APUs because AdoredTV included every Ryzen 3000 series APU in his chart back in December alongside the CPUs, which is rather laughable on its own, as these may launch in 2020 since they're getting refreshed on 12 nm this year.

And in terms of plans changing, sure, but there's a difference between Kickstarter and a billion dollar company.

And my only contention is that AdoredTV's numbers are guesses or fake; as I keep repeating, a 4.3 GHz base clock on a 16-core CPU is not happening, nor is a 4.2 GHz base clock on a 12-core (his numbers). I will not be making a public apology regarding this, as no SKU will have these base clock frequencies.

I'm anticipating boost clocks between 4.6 and 4.8 GHz, and if AdoredTV's figures are 'close' to these, that's no vindication at all of his chart, as anyone can make an educated guess. What wasn't educated is the 4.3 GHz and 4.2 GHz base clocks, which is the giveaway that the numbers are fake.
How do you know this? Where's the proof? You question my knowledge, so I'll question yours in the same way. You have no more proof than I do; in fact, you have less.

Am I Jim? No. Is Jim my source? No.

Yes and No. You clearly have never developed a product.

Well, you believe whatever you want to believe. I know a few of the clocks, but I've promised not to share the information; let's just say it can be done. I really do expect you to apologise to both me and Jim when the time comes. You're incredibly stubborn for someone who has zero information about the topic you're discussing.
efikkan: You know how evidence works, right? […] Speculation is speculation no matter how accurate it turns out to be.
I have nothing more or less to prove than you do. I'm not the one arguing that the leaks are a lie.

I don't have to speculate, I know facts. I never said I have all the facts.

Dude, no offence, but I'm going to block you, as your logic is so flawed I can't deal with you any more.

Posted on Reply
#153
HenrySomeone
Zareek: Agreed, but the leaks say the R9 3850X (16c/32t @ 4.3 GHz base, 5.1 GHz boost) will have the highest frequency, followed by the R7 3700X (12c/24t @ 4.2 GHz base, 5 GHz boost). Personally, I'd like an 8-core with the best clocks they can pull off! The leaks on IPC are all over the place, claiming anywhere from 10-29% depending on the task. All of them seem to agree AMD will still lag in gaming.
I am almost 100% sure that even the best Zen 2 can muster will still be over 10% slower than the 8700K/9700K/9900K in gaming, in some cases probably more like 20%, maybe even 25% (those are the cases where current Zen+ chips lag by 40-45% :D)
Posted on Reply
#154
R0H1T
Is that a stock 9900K vs. 2700X, and which games? This up-to-45% margin sounds bogus to me.
Posted on Reply
#156
bug
@HenrySomeone I have this strange feeling someone buying a high-end CPU is unlikely to be stuck playing at FHD.
Posted on Reply
#157
storm-chaser
I think one thing we can all agree on is we live in fascinating times. Let's just enjoy the lead-up and hype, if that's what you want to call it.

Nobody should be having hurt feelings here. Tech discussion should be fun, first and foremost.
Posted on Reply
#158
bug
storm-chaser: I think one thing we can all agree on is we live in fascinating times. Let's just enjoy the lead-up and hype, if that's what you want to call it.

Nobody should be having hurt feelings here. Tech discussion should be fun, first and foremost.
Not that fascinating, depending on how long you've been tracking things, but a shake-up we haven't seen since Intel unveiled the Core architecture.
Posted on Reply
#159
HenrySomeone
bug: @HenrySomeone I have this strange feeling someone buying a high-end CPU is unlikely to be stuck playing at FHD.
Irrelevant; he wanted to see at least one case of top Intel chips being up to 45% faster, and I provided one. Besides, high refresh rate 1080p gaming actually requires the most CPU power anyway, since there's obviously less GPU bottlenecking, and today's 1080p difference is tomorrow's 1440p.
bug: Not that fascinating, depending on how long you've been tracking things, but a shake-up we haven't seen since Intel unveiled the Core architecture.
Not even close...
Posted on Reply
#160
R0H1T
So an OCed 9900K (5.2 GHz) vs. a 2700 (non-X), and the biggest difference is in min FPS; that's like cherry-picking a particular strain of cherry from a remote country. Do you want to see a benchmark where AMD destroys Intel by an even higher margin? Gaming might be difficult, but for applications I can give you examples.
HenrySomeone: Irrelevant; he wanted to see at least one case of top Intel chips being up to 45% faster, and I provided one. Besides…
No, I asked if it was stock vs. stock, which this obviously is not!
Posted on Reply
#161
efikkan
storm-chaser: I think one thing we can all agree on is we live in fascinating times. Let's just enjoy the lead-up and hype, if that's what you want to call it.

Nobody should be having hurt feelings here. Tech discussion should be fun, first and foremost.
Indeed, in a few months AMD's competitiveness should be at its greatest in over 10 years (for CPUs), and there should be potential for some great deals this fall. This is when the real fun begins.

That doesn't mean we should spread misinformation though.
Posted on Reply
#162
HenrySomeone
R0H1T: So an OCed 9900K (5.2 GHz) vs. a 2700 (non-X), and the biggest difference is in min FPS; that's like cherry-picking a particular strain of cherry from a remote country. Do you want to see a benchmark where AMD destroys Intel by an even higher margin? Gaming might be difficult, but for applications I can give you examples.
No, I asked if it was stock vs. stock, which this obviously is not!
Their 2700 at the time OCed slightly better than their 2700X; Steve explained that clearly. Yes, with a golden sample you can get a 4.3 GHz all-core OC, but it hardly matters, and besides, you have your 2700X in the last graph, and stock vs. stock the 9900K is just under 40% faster for average frames and, much more importantly, over 50% faster for the 0.1% lows :rolleyes: I mean, Ryzen can't even hold over 60 fps at all times, which I find just embarrassing, and all of those are mainstream games; nothing particularly cherry-picked about that. Obviously not all games run that much better on Intel, which I've never claimed, but they exist, and not just one or two. Oh, and please show me that AMD-winning benchmark, but it has to be at thread parity, otherwise it will just be more of that better-value PR.
Posted on Reply
#163
R0H1T
You said up to 45% slower, so even if we discount the massive OCing handicap, I'm sure you can do the math: the 2700 (non-X) is not up to 45% slower in any of those results.
How about you do a re-take of that one?
HenrySomeone: Oh, and please show me that AMD-winning benchmark, but it has to be at thread parity, otherwise it will just be more of that better-value PR.
Check the previous flagship MSDT 7700K vs. the lowly 2400G :rolleyes:
Posted on Reply
#164
storm-chaser
HenrySomeone: Their 2700 at the time OCed slightly better than their 2700X; Steve explained that clearly. […]
Ignore this guy. Anti-AMD troll.
Posted on Reply
#165
NdMk2o1o
HenrySomeone: Irrelevant; he wanted to see at least one case of top Intel chips being up to 45% faster, and I provided one. Besides, high refresh rate 1080p gaming actually requires the most CPU power anyway, since there's obviously less GPU bottlenecking, and today's 1080p difference is tomorrow's 1440p.

Not even close...
AMD trolling 101: cherry-pick one result with an unusually CPU-dependent game that is by no means the norm, then cherry-pick numbers out of your butt... Got it, thanks :)
Posted on Reply
#166
bug
R0H1T: So an OCed 9900K (5.2 GHz) vs. a 2700 (non-X), and the biggest difference is in min FPS; that's like cherry-picking a particular strain of cherry from a remote country. Do you want to see a benchmark where AMD destroys Intel by an even higher margin? Gaming might be difficult, but for applications I can give you examples.
No, I asked if it was stock vs. stock, which this obviously is not!
Tbh, min frame rate is what will kill your gaming experience. But it's still cherry-picking, and as I already pointed out, you're not going to buy a high-end CPU to game at FHD. At QHD and UHD the CPU is no longer the bottleneck anyway.
There are still scenarios where faster cores are needed more than additional cores, but those are the only saving grace left for the current generation of Intel CPUs. If Zen 2 closes that gap, Intel is going to need Ice Lake, like, ultra-fast.
Posted on Reply
#167
EarthDog
bug: There are still scenarios where faster cores are needed more than additional cores, but those are the only saving grace left for the current generation of Intel CPUs.
That is most scenarios. There isn't anything a mainstream Intel (or AMD) CPU can't do for 95% of people. 16 cores on mainstream is ridiculous today.
Posted on Reply
#169
NdMk2o1o
EarthDog: That is most scenarios. There isn't anything a mainstream Intel (or AMD) CPU can't do for 95% of people. 16 cores on mainstream is ridiculous today.
No it's not. If you don't need 16 cores, don't buy them; I'm sure many enthusiasts and previous-gen HEDT users will lap them up. I honestly can't see what the issue is: so it would be OK on a different platform, but because it's on AM4's mainstream platform it's ridiculous? That makes no sense. And of course X570 has more PCIe lanes and PCIe 4.0, so there's even more of a reason to have a higher core count processor, IMO. The sad truth is Intel has hindered technology advancement for the last 10 years with quad cores. Consoles will have and make use of 8 cores, which will now start to become the norm because of that, and PC gamers have always been ahead of the curve when it comes to consoles.
Posted on Reply
#170
bug
EarthDog: That is most scenarios. There isn't anything a mainstream Intel (or AMD) CPU can't do for 95% of people. 16 cores on mainstream is ridiculous today.
Amen to that.
Posted on Reply
#171
EarthDog
NdMk2o1o: No it's not. If you don't need 16 cores, don't buy them; I'm sure many enthusiasts and previous-gen HEDT users will lap them up. […]
LMK when software catches up... we've been waiting for 8 years. :)

Yes, it's OK on HEDT; it's a workstation platform, a distinct segment from the rest. More cores on mainstream help fewer people directly.
Posted on Reply
#172
bug
EarthDog: LMK when software catches up... we've been waiting for 8 years. :)
Yeah, the age-old misunderstanding. Software already uses 2-3 orders of magnitude more threads than there are physical cores in a system. The thing is, there's just not enough load to be distributed between those threads to saturate 4 cores most of the time, let alone 16.
Core count is becoming the new MHz race. And I don't have anything against building what is essentially a better CPU overall, it's just that many people are wasting money by not weighing what a CPU can do against their actual needs.
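(As a rough, hypothetical illustration of the point about thread counts vs. load, not something from the thread: a minimal, Linux-only Python sketch that counts system-wide OS threads via /proc and compares them with the hardware thread count. Even a near-idle desktop shows the oversubscription bug describes.)

```python
import glob
import os

# Count every OS thread on a Linux system: each thread shows up as a
# directory under /proc/<pid>/task/.
def count_threads() -> int:
    total = 0
    for task_dir in glob.glob("/proc/[0-9]*/task"):
        try:
            total += len(os.listdir(task_dir))
        except OSError:
            pass  # the process exited while we were scanning
    return total

if __name__ == "__main__":
    hw_threads = os.cpu_count()
    sw_threads = count_threads()
    print(f"{sw_threads} software threads vs {hw_threads} hardware threads "
          f"(~{sw_threads / hw_threads:.0f}x oversubscription)")
    # A typical desktop reports well over a thousand threads on 8-16 hardware
    # threads, yet sits near-idle: most are blocked, not competing for cores.
```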
Posted on Reply
#173
EarthDog
bug: Core count is becoming the new MHz race. And I don't have anything against building what is essentially a better CPU overall, it's just that many people are wasting money by not weighing what a CPU can do against their actual needs.
Exactly. I don't understand the hard-on most people have for more cores when they can't use them.

That said, I understand innovation and moving forward. I do appreciate the increased IPC and clocks as well as the imminent price drops we are likely to see if these perform well enough.

The real winners here, and I've said this before, are the cheap quad-/hex-/octo-core parts with SMT. Much more than that, and few need to care.
Posted on Reply
#174
bug
EarthDog: Exactly. I don't understand the hard-on most people have for more cores when they can't use them.
The hard-on is about getting more cores than you can use and then complaining that developers are lazy and don't want to put your hardware to good use ;)
Posted on Reply
#175
efikkan
It's a common misunderstanding that poor multicore scaling is primarily down to a lack of good software. I explained this some more here.
TL;DR: most real-world tasks can't scale across an arbitrary number of cores, so unless you're running more tasks, or more typical server workloads, adding more and more cores is only going to give you diminishing returns, and even lower performance if you at some point have to sacrifice core performance for more cores.

Single-core performance is essential and will only become more important in the coming years, even for processes that use many threads, due to synchronization overhead. But the clock-speed race seems to be nearly over, so future gains will have to come from IPC increases.
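(The diminishing-returns argument above is essentially Amdahl's law. A minimal sketch follows; the 90% parallel fraction is a made-up example, not a figure from the thread.)

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the
# parallel fraction of the workload and n the number of cores.
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

if __name__ == "__main__":
    p = 0.90  # hypothetical workload: 90% of the work scales with cores
    for n in (2, 4, 8, 16, 32):
        print(f"{n:2d} cores -> {amdahl_speedup(p, n):.2f}x speedup")
    # 2 cores -> 1.82x, 8 -> 4.71x, 16 -> 6.40x, 32 -> 7.80x:
    # going from 8 to 16 cores adds far less than the first doubling did.
```

In this example the jump from 8 to 16 cores buys only about 1.36x, while the first doubling bought 1.82x, which is the diminishing-returns curve described above.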
Posted on Reply