Tuesday, March 28th 2017

AMD 16-core Ryzen a Multi-Chip Module of two "Summit Ridge" Dies

With core performance back to competitiveness, AMD is preparing to take on Intel in the HEDT (high-end desktop) segment with a new line of processors that are larger than its current socket AM4 "Summit Ridge" desktop processors, but smaller in core count than its 32-core "Naples" enterprise processors. These could include 12-core and 16-core parts, and the picture is getting clearer with an exclusive report by Turkish tech publication DonanimHaber. The biggest revelation here is that the 12-core and 16-core Ryzen processors will be multi-chip modules (MCMs) of two "Summit Ridge" dies. The 12-core variant will be carved out by disabling one core per CCX (3+3+3+3).

Another revelation is that the 12-core and 16-core Ryzen processors will be built on a new LGA package with a pin count in excess of 4,000. Since it's an MCM of two "Summit Ridge" dies, the memory bus width and PCIe lane count are doubled. The chip will feature a quad-channel DDR4 memory interface, and will have a total of 58 PCI-Express gen 3.0 lanes (only one of the two dies will drive the PCI-Express 3.0 x4 A-Link chipset bus). The increase in core count isn't coming with a decrease in clock speeds. The 12-core variant will hence likely have its TDP rated at 140 W, and the 16-core variant at 180 W. AMD is expected to unveil these chips at the 2017 Computex expo in Taipei this June, with product launches following shortly after.
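To put the report's figures in perspective, here is a rough sketch (in Python) of the arithmetic, under the assumption that each "Summit Ridge" die keeps its known layout of two quad-core CCXs and a dual-channel DDR4 controller; the PCIe lane total is taken straight from the report rather than derived:

# Core-count and memory arithmetic for a two-die "Summit Ridge" MCM (sketch).
DIES_PER_MCM = 2
CCX_PER_DIE = 2               # two CCX core complexes per Summit Ridge die
CORES_PER_CCX = 4             # four cores per CCX on the full 8-core die
MEM_CHANNELS_PER_DIE = 2      # dual-channel DDR4 per die
PCIE_LANES_TOTAL = 58         # per the report; one die's x4 link feeds the chipset

cores_16 = DIES_PER_MCM * CCX_PER_DIE * CORES_PER_CCX        # 16 cores (4+4+4+4)
cores_12 = DIES_PER_MCM * CCX_PER_DIE * (CORES_PER_CCX - 1)  # 12 cores (3+3+3+3)
mem_channels = DIES_PER_MCM * MEM_CHANNELS_PER_DIE           # quad-channel DDR4

print(cores_16, cores_12, mem_channels, PCIE_LANES_TOTAL)    # 16 12 4 58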
Source: DonanimHaber (YouTube)

62 Comments on AMD 16-core Ryzen a Multi-Chip Module of two "Summit Ridge" Dies

#26
bug
idx: I get the feeling that a few months from now all quad-core CPUs will feel like a Core 2 Duo.

Again, as I mentioned before, this is the 2007-2008 Intel Core 2 Duo vs. Core 2 Quad drama all over again. We got the same type of "thinking" arguing about how useless the extra cores are and so on.

I just don't get why some people get so mad at you if you mention anything about AMD or an 8+ core chip.
The reality is a 4-core chip really has NO HOPE vs an 8+ core chip, as soon as applications and games start making good use of them.

Plus... if a 4-core chip is good enough for someone's use, that doesn't necessarily mean this "someone" should start a war on anyone who says "more than 4 cores".
I don't see any drama here, just as I didn't see any back then. Generic software always targets mainstream CPUs. Back then dual-core was mainstream, today it's quad-core. Everybody's still free to buy whatever they want, regardless.

Also, I'm not sure whether you realize it, but in the bolded part you've used the present tense in the first two sentences and the future (?) tense in the last one. 4 cores will have no hope when that happens, but that certainly is not today.
#27
idx
Gasaraki: Agreed. I haven't run 1080p in like 6 years.

People buying $300+ processors and $150+ motherboards to run a $150 monitor...
1080p should have died already.
#28
Steevo
I figured they would have a multiplexed chipset solution to reduce the number of pins to the PCIe bus and other devices on larger chips: four pins that are quad-pumped to give 16 pins' worth of bandwidth.

But I look forward to seeing how they handle saturation and what effect it has on stability and latency on die.
#29
wiyosaya
pantherx12: And the other a dual-socket, 88 PCIe lanes type deal.

I have a feeling the dual-socket board will be entry-level enterprise, so us regular consumers won't be seeing that.
A good search will likely reveal one available to the regular customer - should they want one. I doubt, though, that someone looking for one would fall into the regular customer category.
#30
Captain_Tom
PerfectWave: Finally a decent mobo that can compete with X99. Tbh the only problem with Ryzen is the mobos. I don't get why ppl buy Ryzen for playing at 1080p. Really a waste.
LOL is anyone?

Furthermore, I think people keep forgetting that the 6- and 4-core models launch in a couple of weeks. If you are making a budget 1080p build, you would be an idiot to get a $300 i7 instead of a $150 R5. That extra $150 is the difference between a high-end and a midrange graphics card.

It's not like Ryzen can't do 60 FPS either lol.

P.S. If you get fast RAM, Ryzen meets or beats the 7700K anyways...
#31
bug
idx: 1080p should have died already.
Yeah, because mid-range cards stopped having problems handling everything at max details at FHD years ago. /s
#32
Captain_Tom
bug: Yeah, because mid-range cards stopped having problems handling everything at max details at FHD years ago. /s
I actually have to back up Bug on this one.


Don't get me wrong, I recommend 4K Freesync IPS to everyone, but at the same time that's only really because they don't cost much more than lower-spec monitors. You spend twice as much as 1080p to have vastly better color and contrast qualities, in addition to having the option of running 4K in games that are easy enough to run.

But the fact is that even the $700 1080 Ti isn't running all games at 60 FPS in 4K, and if you want 144Hz+ gaming - 1080p is still the only real option imo. (Don't make me laugh at 1440p @ 144Hz, it's harder to run than 4K @ 60).
#33
Slizzo
.....

How is running 1440P @ 144Hz harder than 4K at 60Hz? 4K is more than twice the resolution of 1440P...
#34
Captain_Tom
Slizzo: .....

How is running 1440P @ 144Hz harder than 4K at 60Hz? 4K is more than twice the resolution of 1440P...
144 is over double the framerate of 60 lol
#35
Slizzo
Captain_Tom: 144 is over double the framerate of 60 lol
Yes, at less than half the resolution.
#36
idx
Captain_Tom: I actually have to back up Bug on this one.


Don't get me wrong, I recommend 4K Freesync IPS to everyone, but at the same time that's only really because they don't cost much more than lower-spec monitors. You spend twice as much as 1080p to have vastly better color and contrast qualities, in addition to having the option of running 4K in games that are easy enough to run.

But the fact is that even the $700 1080 Ti isn't running all games at 60 FPS in 4K, and if you want 144Hz+ gaming - 1080p is still the only real option imo. (Don't make me laugh at 1440p @ 144Hz, it's harder to run than 4K @ 60).
The only reason I love 4K-5K displays is the pixel density. If the PPI is around 150-200, you really don't need AA or any kind of pixel filter; turn it OFF and this will get you a really good frame rate overall.
I do think it's much better in general to have a higher screen resolution, even if you don't max out the resolution in games. It is always nice to have apps, text, photos and everything on your system looking clear and sharp.
#37
Unregistered
More pins for some clumsy gibbon to bend; time AMD dropped the pins like Intel. Put the pins on the cheap bit, i.e. the motherboard.

16 cores means more porn tabs woo hoo.
#38
bug
Slizzo: .....

How is running 1440P @ 144Hz harder than 4K at 60Hz? 4K is more than twice the resolution of 1440P...
Let's see:
  • 3,840 × 2,160 × 60 = 497,664,000 pixels/second to render
  • 2,560 × 1,440 × 144 = 530,841,600 pixels/second to render
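(The same arithmetic as a quick Python sketch, using the two modes quoted above:)

def pixels_per_second(width, height, refresh_hz):
    # Pixels the GPU has to deliver every second at the given mode.
    return width * height * refresh_hz

print(pixels_per_second(3840, 2160, 60))   # 497,664,000 - 4K @ 60 Hz
print(pixels_per_second(2560, 1440, 144))  # 530,841,600 - 1440p @ 144 Hz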
#39
cdawall
where the hell are my stars
tigger: More pins for some clumsy gibbon to bend; time AMD dropped the pins like Intel. Put the pins on the cheap bit, i.e. the motherboard.

16 cores means more porn tabs woo hoo.


Way ahead of you buddy
#40
EasyListening
This is in preparation for news that Crossfire works perfectly with no SLI-style microstutter in quad-GPU configs. Plug a 16-core Ryzen in with four single-slot Radeon Pro workstation GPUs and now we're talking.

Nvidia abandoned multi-GPU and AMD embraced it. In the future, AMD will be releasing Vega GPUs in configurations similar to what they are talking about with Ryzen; I mean multi-chip Vega and Polaris packages. Their GPU strategy for beating Nvidia will be similar to what they did to Intel with Ryzen: super fast cores and a lot of them, combined with the 14nm Samsung process, which allows AMD to run relatively high voltages through their chips due to the different way Samsung does 14nm as opposed to Intel. As Intel wastes its time trying to get more performance out of 10nm, Samsung figured out how to efficiently run more voltage through 14nm without the bad thermal behavior exhibited by the simpler 14nm manufacturing process used on Kaby Lake.

I'm also expecting more support for water cooling from AMD. I'm thinking AMD-designed water blocks backed by manufacturer warranties, maybe partnering up with PC builders to sell pre-built workstations with quad-GPU configs and a full liquid-cooling loop, backed by an AMD warranty and supported by the vendor. Something like that.
#41
Prima.Vera
pantherx12: Mmm, more cores means more Chrome tabs!
or Firefocks!
#42
Frick
Fishfaced Nincompoop
Captain_Tom: Don't get me wrong, I recommend 4K Freesync IPS to everyone, but at the same time that's only really because they don't cost much more than lower-spec monitors. You spend twice as much as 1080p to have vastly better color and contrast qualities, in addition to having the option of running 4K in games that are easy enough to run.
Very decent 1080p IPS monitor (not 144 Hz): €150. Entry-level 4K IPS: €450. Which begs the question: is a so-so 4K panel inherently better than a good 1080p panel? I have no idea.
#43
HopelesslyFaithful
idx: The only reason I love 4K-5K displays is the pixel density. If the PPI is around 150-200, you really don't need AA or any kind of pixel filter; turn it OFF and this will get you a really good frame rate overall.
I do think it's much better in general to have a higher screen resolution, even if you don't max out the resolution in games. It is always nice to have apps, text, photos and everything on your system looking clear and sharp.
You're dumb. You need AA on a 4K 27-inch monitor.

Your eyes can see 300+ PPI at 2 feet and 700+ at 1 foot. Plus AA still makes it smoother.
Frick: Very decent 1080p IPS monitor (not 144 Hz): €150. Entry-level 4K IPS: €450. Which begs the question: is a so-so 4K panel inherently better than a good 1080p panel? I have no idea.
It all depends on what you want to do.
#44
Frick
Fishfaced Nincompoop
HopelesslyFaithful: You're dumb. You need AA on a 4K 27-inch monitor.

Your eyes can see 300+ PPI at 2 feet and 700+ at 1 foot. Plus AA still makes it smoother.

It all depends on what you want to do.
1. Why the insults?
2. And available money.
#45
Imsochobo
tigger: More pins for some clumsy gibbon to bend; time AMD dropped the pins like Intel. Put the pins on the cheap bit, i.e. the motherboard.

16 cores means more porn tabs woo hoo.
Until the pins on the motherboard kill your CPU by causing a short.
Seen it happen.

I don't support either as the "better" solution; they both have their issues, and both are equally flawed.

By the way, use Linux and Chrome for porn duties.
Linux has better security, and Chrome runs a lot better with more cores, for a better porn experience.
#46
idx
HopelesslyFaithful: You're dumb. You need AA on a 4K 27-inch monitor.

Your eyes can see 300+ PPI at 2 feet and 700+ at 1 foot. Plus AA still makes it smoother.

It all depends on what you want to do.
Thanks for the insult.

And NO, 200 PPI is not that bad at all if you carefully pick a good monitor (ofc the more PPI the better). You can still see the pixels, but it is way better than 1080p.

4K-5K without AA is way better than 1080p with 8x MSAA or even 4x SSAA. When you apply 4x SSAA, your GPU renders everything at four times the pixel count and then downscales it to 1080p.

And if you are using a 4K screen without AA, the load on the GPU will be similar or close to that of 4x SSAA on 1080p. For extremely thin lines or edges, if your GPU can handle it, you can apply something like CMAA; it will do the job without adding too much load.
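(A rough sketch of that load comparison, assuming 4x SSAA means rendering at twice the width and height of the 1080p output before downscaling:)

# 4x SSAA at 1080p takes four samples per output pixel,
# i.e. an internal render target of 2x width and 2x height.
ssaa_internal = (1920 * 2, 1080 * 2)        # (3840, 2160)
native_4k = (3840, 2160)

print(ssaa_internal[0] * ssaa_internal[1])  # 8,294,400 pixels shaded per frame
print(native_4k[0] * native_4k[1])          # 8,294,400 - roughly the same per-frame load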
#47
HopelesslyFaithful
idx: Thanks for the insult.

And NO, 200 PPI is not that bad at all if you carefully pick a good monitor (ofc the more PPI the better). You can still see the pixels, but it is way better than 1080p.

4K-5K without AA is way better than 1080p with 8x MSAA or even 4x SSAA. When you apply 4x SSAA, your GPU renders everything at four times the pixel count and then downscales it to 1080p.

And if you are using a 4K screen without AA, the load on the GPU will be similar or close to that of 4x SSAA on 1080p. For extremely thin lines or edges, if your GPU can handle it, you can apply something like CMAA; it will do the job without adding too much load.
You said you don't need AA, which is factually wrong and spreads bad info. AA still makes a significant difference, especially 2x AA.

Stop trying to save face.

This game does really well without AA, but 2x still makes a good difference:
www.extremetech.com/gaming/180402-five-things-to-know-about-4k-gaming-were-glitching-our-way-to-gaming-nirvana/3

I play mostly older games, and AA makes a world of difference. I think newer games have done better at not needing AA as much.
#48
bug
HopelesslyFaithful: You said you don't need AA, which is factually wrong and spreads bad info. AA still makes a significant difference, especially 2x AA.

Stop trying to save face.

This game does really well without AA, but 2x still makes a good difference:
www.extremetech.com/gaming/180402-five-things-to-know-about-4k-gaming-were-glitching-our-way-to-gaming-nirvana/3

I play mostly older games, and AA makes a world of difference. I think newer games have done better at not needing AA as much.
I was never able to see 2x MSAA's benefits. 4x is the minimum required. 2x makes a little difference that I can only see in screenshots, or if I sit still in-game and look for details.

And those pictures tell us squat. Of course you're going to see jaggies when you look at a 4K screenshot on an FHD monitor (you're actually zooming in). The question is: are those jaggies visible when the pixel is physically 4x smaller?
#49
infrared
HopelesslyFaithful: Stop trying to save face.
No... He has his idea, you have yours... You can discuss your opinions, but he's under no obligation to change his mind because of what you've said. By saying he's dumb and telling him to stop trying to save face, you're just being a d*ck. That's my opinion anyway. Continue as you please.

Edit: And back on topic, these 16-core chips sound awesome! :D The 8-core Summit Ridge is already running rings around my 6700K @ 4.7 GHz.
#50
cdawall
where the hell are my stars
HopelesslyFaithful: You said you don't need AA, which is factually wrong and spreads bad info. AA still makes a significant difference, especially 2x AA.

Stop trying to save face.

This game does really well without AA, but 2x still makes a good difference:
www.extremetech.com/gaming/180402-five-things-to-know-about-4k-gaming-were-glitching-our-way-to-gaming-nirvana/3

I play mostly older games, and AA makes a world of difference. I think newer games have done better at not needing AA as much.
He is correct, you don't "need" AA; you don't "need" anything. There is nothing in your life that requires a screen to be more or less clear in video games.