Tuesday, April 16th 2019

AMD Zen3 to Leverage 7nm+ EUV For 20% Transistor Density Increase

AMD "Zen 3" microarchitecture could be designed for the enhanced 7 nm+ EUV (extreme ultraviolet) silicon fabrication node at TSMC, which promises a significant 20 percent increase in transistor densities compared to the 7 nm DUV (deep ultraviolet) node on which its "Zen 2" processors are being built. In addition, the node will also reduce power consumption by up to 10 percent at the same operational load. In a late-2018 interview, CTO Mark Papermaster stated AMD's design goal with "Zen 3" would be to prioritize energy-efficiency, and that it would present "modest" performance improvements (read: IPC improvements) over "Zen 2." AMD made it clear that it won't drag 7 nm DUV over more than one microarchitecture (Zen 2), and that "Zen 3" will debut in 2020 on 7 nm+ EUV.
Source: PCGamesN

90 Comments on AMD Zen3 to Leverage 7nm+ EUV For 20% Transistor Density Increase

#76
bug
Xaled: Did Intel really improve IPC since Sandy Bridge?
Not much, but they did improve IPC until Sandy Bridge.
#77
Ascendor81
las: Nope.

Better value, because performance is hit and miss. I have tried and built several Ryzen rigs by now. Performance is not good across the board, hence the price.

Sorry, las, but you are wrong. MSI said the customer service rep was misinformed; the Ryzen 3000 CPUs are under evaluation for 300- and 400-series boards.
#78
kanecvr
las: Huh what? It's a fact. Ryzen performance is hit and miss depending on workload, and especially in games (high-fps gaming, that is).

In tons of applications Ryzen sucks too. Handbrake, to mention one.
High-fps gaming is a niche market - it speaks mostly to competitive gamers - and out of all competitive games, only FPS titles benefit from it. I've never seen a LoL, SCII or WoT player ask for a high refresh rate display, but all are adamant about FreeSync or G-Sync. The large majority of my clients don't do competitive gaming and want 2K or 4K @ 60 fps. Very few of them are interested in 120 or 144 Hz displays (under 10% of my gamer/enthusiast clients). I myself am more partial to high resolution at 60 fps than to 1080p at 144 Hz. The image quality in newer games (like Anno 1800, for example) is staggering, and 60 fps is fine if frame times are good.
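For reference, a quick sketch of the frame-time budgets behind those refresh rates (purely illustrative arithmetic):

```python
# Frame-time budget per refresh rate; illustrative arithmetic only.
for hz in (60, 120, 144):
    frame_time_ms = 1000 / hz
    print(f"{hz:>3} Hz -> {frame_time_ms:.1f} ms per frame")
# 60 Hz leaves ~16.7 ms per frame; 144 Hz leaves only ~6.9 ms,
# which is why consistent frame times matter more than the average fps number.
```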
#79
efikkan
notb: I don't really understand what you mean by "re-spin". DDR5 won't come as a surprise. CPU makers are taking part in memory development. DDR5 and the appropriate CPUs have been developed together and can be launched together.
Yes, obviously all relevant parties have worked on their prototypes for years; in fact, most standards are derived from prototypes and are certainly not created in a vacuum.

But I don't get why so many people are fixated on DDR5 - is there really a rush? CPU vendors will make the switch when it's needed and it's ready, not before. And based on the information I've seen, DDR5 doesn't look so good yet in terms of latency.

For mainstream users, dual-channel DDR4 at 2666 MHz is plenty, and if you're a content creator you can always go for a quad-channel HEDT configuration. Memory bandwidth is primarily a bottleneck for heavy server loads, which is why the upcoming Ice Lake-SP will feature 8 memory channels per socket. Higher memory bandwidth only really helps if a workload is actually bottlenecked by it, and over the past decade memory bandwidth has grown much faster than core speed, so generally you need heavy multithreading to be bottlenecked by memory bandwidth. As of now, DDR4 supports up to 3200 MHz, but I don't know if it can be expanded further.
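To put rough numbers on that, a minimal sketch of theoretical peak bandwidth for the configurations mentioned (64-bit channels; sustained, real-world figures are lower):

```python
# Theoretical peak DRAM bandwidth = channels * 8 bytes/transfer * transfer rate (MT/s).
# Illustrative only; sustained bandwidth is always lower than this peak.
def peak_bandwidth_gbs(channels: int, mts: int) -> float:
    return channels * 8 * mts / 1000  # GB/s

configs = {
    "Dual-channel DDR4-2666 (mainstream)": (2, 2666),
    "Quad-channel DDR4-3200 (HEDT)":       (4, 3200),
    "8-channel DDR4-3200 (server)":        (8, 3200),
}
for name, (ch, mts) in configs.items():
    print(f"{name}: ~{peak_bandwidth_gbs(ch, mts):.1f} GB/s")
```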
notb: Macs depend on x86 software ported from Windows. This is the reason Apple switched to Intel in the first place. They'll have to make sure every important piece of software is available for ARM. Most isn't.
Well, actually not.
The primary motivation was hardware. PPC promised to replace x86, but failed to do so. Apple needed an architecture capable of scaling from a low-power laptop to a high-end workstation, and only x86 could do that.

In terms of software, OS X has no relation to Windows or its ecosystem at all. The Darwin kernel is a mix of the Mach microkernel and BSD Unix, and it largely relies on the BSD ecosystem, APIs, compilers, etc., though they are now increasingly drifting away from that and creating their own walled garden… Most of Apple's current software is already compatible with ARM, so that's not really a big concern.

But ARM is not close to performant enough to replace x86, and that won't change anytime soon. ARM devices like iPhones and iPads rely heavily on specialized instructions to accelerate workloads, and anything else will perform like crap. If Apple switched completely to ARM, it would mean either focusing solely on low-performance "web browsing devices" or dealing with a hornet's nest of software patchwork and specialized instructions to stay "competitive".

But on an interesting note regarding AMD: Zen was supposed to be the stepping stone up to K12, the new big CPU architecture from AMD, based on ARM. Meanwhile K12 is MIA, and probably already obsolete compared to other ARM designs. So Zen 2, 3…5 - is it the backup plan after the failure of K12? Is this why we don't hear anything about a Zen successor yet?
#80
mtcn77
efikkan: But I don't get why so many people are fixated on DDR5 - is there really a rush? CPU vendors will make the switch when it's needed and it's ready, not before. And based on the information I've seen, DDR5 doesn't look so good yet in terms of latency.
Ryzen 3000 comes with bad-memory-cell re-addressing options to improve row refresh intervals without incurring DRAM corruption.
#81
timta2
notb: Macs depend on x86 software ported from Windows. This is the reason Apple switched to Intel in the first place. They'll have to make sure every important piece of software is available for ARM. Most isn't.

Apple could offer an ARM-powered MacBook next to the x86 one (just like Microsoft does with the Surface).
An ARM-exclusive lineup? Think 2025+. But don't bet your house on it.

ARM servers are not compatible with x86 and - assuming ARM will become a more attractive option at some point (likely!) - it'll take a decade before a significant part of the market migrates. And x86 will stay with us for many years.
Both Intel and AMD are thinking about the ARM threat. They'll join in if necessary. Don't worry too much. :)
You clearly know nothing about Macs or macOS (Previously Mac OS X).
#82
Melvis
Did some people forget that AMD already beat the 9900K clocked at its MCE settings with a lower-clocked engineering sample? Just thought I'd jog the memory of some Intel peeps on here who keep posting BS on this forum.

Zen 3 might be what I upgrade to next from my 2700X. It all depends on price/performance, which matters more than just outright performance at double the price, cough cough.
#83
notb
timta2: You clearly know nothing about Macs or macOS (Previously Mac OS X).
You're free to share your knowledge and correct my mistake. Wouldn't that be more productive than mocking? :-)
#84
R0H1T
efikkan: […] But ARM is not close to performant enough to replace x86, and that won't change anytime soon. […] Zen was supposed to be the stepping stone up to K12, the new big CPU architecture from AMD, based on ARM. Meanwhile K12 is MIA, and probably already obsolete compared to other ARM designs. So Zen 2, 3…5 - is it the backup plan after the failure of K12? Is this why we don't hear anything about a Zen successor yet?
Not vanilla ARM cores, but custom Axx - this myth that Intel or even AMD are so far ahead of everybody else will be broken soon enough ~

Apple's nearest competitor in this space is Apple itself, and that was last year's chip, so a two-year gap with a node shrink is definitely good enough IMO to get them into their own notebooks. From there it's only Apple's own ambitions that stop them.
#85
mtcn77
R0H1T: Not vanilla ARM cores, but custom Axx - this myth that Intel or even AMD are so far ahead of everybody else will be broken soon enough ~

Apple's nearest competitor in this space is Apple itself, and that was last year's chip, so a two-year gap with a node shrink is definitely good enough IMO to get them into their own notebooks. From there it's only Apple's own ambitions that stop them.
People ask what there is to love about Qualcomm. In terms of task energy, Apple is their oyster.
#86
notb
efikkan: Most of Apple's current software is already compatible with ARM, so that's not really a big concern.
Oh, that's the misunderstanding. :-)
I didn't mean Apple software. They aren't making anything critical at the moment (mostly stuff you need for everyday computing, multimedia, etc.).
I meant 3rd-party software, which is only available on x86.
Apple's shift to Intel CPUs made OS X a feasible alternative to Windows and Linux a decade ago.

If you're from the US, you may have different memories. Macs were popular in the US even before that.
Outside of the US they were very rare. Too expensive for home users. Too useless for most professional work.

x86 was an architecture people knew how to use. It was supported by important software. It had all the useful libraries.
Moving from Windows/Linux to a Mac suddenly became very easy.
I was studying back then and I remember all the Mac Pro and MacBook boxes lying around.
efikkan: But ARM is not close to performant enough to replace x86, and that won't change anytime soon.
But it can be frugal. That's the use case at the moment. And the cores are simpler, so you can put more of them in a package. In many ways ARM servers will target the same niche AMD does with EPYC. So if you believe EPYC makes sense, then ARM will as well.
Server scenarios built around complex, single-threaded tasks are out of ARM's reach, and that's unlikely to change. x86 will always have the edge there.
efikkan: Is this why we don't hear anything about a Zen successor yet?
Assuming the current Zen will be able to compete with Intel for at least 3 years, and that AMD may keep selling it for another 5... a Zen successor could be a fairly young project.
But who knows? The chiplet idea gives them some interesting possibilities. Can they put an ASIC or FPGA there? Could they open it up? Cooperate with clients who could customize the designs to fit their software?

The PS5 is going to support RTRT. We don't know how. We know it's a custom Zen. Wouldn't it be nice if Sony could add a custom chiplet that does ray tracing? What if Nvidia made it?
:-)
#87
juiseman
I don't see why Apple doesn't start using AMD in addition to Intel CPUs to expand their available options. They could use some of the new APUs from AMD. That would cut costs, so they could offer a budget entry-level MacBook at an affordable price. That would expand their user base greatly. Once a person gets used to macOS and knows they can get a sub-$600 laptop, they will most likely stick with it. People don't always like change; they want something that works when they need it to...
With all the Windows 10 hate lately, AMD getting competitive again in the server and consumer markets, plus the decline in their overpriced iPhone market, it would seem like an opportune time for Apple to take back some of the consumer desktop/laptop share.

I think that would make more common sense than trying to port x86 to ARM, plus trying to develop a CPU that could come close to Intel's and AMD's current CPUs...

Everybody knows AMD CPUs can run OS X (or macOS); the Hackintosh community has been modding the Mach kernel since the early days of the switch to Intel - 2006-2007ish, I believe.

Technically, they could also just sell a version of macOS to run on PCs if they wanted to. Sell it for $150-250 and make a lot of money. I think people would pay for that instead of doing all the tricks to get macOS to run on PC hardware...

I think they need to do something other than rely solely on iPhone sales.
That market will continue to decline; I remember when you could get a new phone for sub-$400 in the not too distant past. Now? Really, $1300? Forget that...
I'll stick to my $60 Wal-Mart phone that runs Android and that I replace every 1 1/2 years...
#88
bug
juiseman: I don't see why Apple doesn't start using AMD in addition to Intel CPUs to expand their available options.
Apple never sources from several suppliers. It's one of the ingredients in the recipe that keeps their costs down and gives them their high margins.
#89
efikkan
notb: Oh, that's the misunderstanding. :-)

I didn't mean Apple software. They aren't making anything critical at the moment (mostly stuff you need for everyday computing, multimedia, etc.).

I meant 3rd-party software, which is only available on x86.
Fair enough, but the selection of commonly used productivity third-party applications for the Mac is fairly limited, and since Apple seems to be trying to get out of the pro market, the selection of "required" applications will probably be very small.

BTW, don't they still own a part of Adobe?
notb: x86 was an architecture people knew how to use. It was supported by important software. It had all the useful libraries.
Sure, but it's worth mentioning that most professional tools available for Macs were already available on PPC - but of course, everyone appreciated ditching PPC. :)
notb: But it can be frugal. That's the use case at the moment. And the cores are simpler, so you can put more of them in a package. In many ways ARM servers will target the same niche AMD does with EPYC. So if you believe EPYC makes sense, then ARM will as well.
Well, if you consider that ARM chips in mobile devices, TVs, Blu-ray players, etc. are basically very weak CPUs with specialized acceleration for anything that needs performance, then they are not really that efficient in generic code. Without that acceleration these CPUs would struggle to open a web browser or play a video.
ARM fundamentally requires more operations to do the same work, so it needs a much higher clock speed. Also, your high-TDP desktop CPU spends a lot of its die space on a much bigger prefetcher, more SIMD features, etc. - things that make a huge difference when using e.g. Photoshop or Premiere.

EPYC certainly makes sense for some workloads; ARM servers don't, really.
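As a very rough illustration of why SIMD width and clock speed matter, here is a toy peak-throughput calculation; the vector widths, clocks, and pipe counts below are hypothetical, not figures for any specific ARM or x86 core:

```python
# Toy peak FLOPS estimate: vector lanes * 2 (FMA = 2 FLOPs) * FMA pipes * clock.
# All parameters are hypothetical; this only illustrates how wider SIMD and
# higher clocks raise peak throughput for workloads like image/video filters.
def peak_gflops(vector_bits, fma_units, clock_ghz, element_bits=32):
    lanes = vector_bits // element_bits
    return lanes * 2 * fma_units * clock_ghz

print("128-bit unit @ 2.5 GHz, 2 FMA pipes:", peak_gflops(128, 2, 2.5), "GFLOPS")
print("256-bit unit @ 4.0 GHz, 2 FMA pipes:", peak_gflops(256, 2, 4.0), "GFLOPS")
```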
notb: The PS5 is going to support RTRT. We don't know how. We know it's a custom Zen. Wouldn't it be nice if Sony could add a custom chiplet that does ray tracing? What if Nvidia made it?
Coming from PR guys, that can mean just about anything, so I wouldn't read too much into it before we have real details. Light, well-crafted use of ray tracing is technically already possible on existing hardware using OpenCL or CUDA, so it all comes down to what they mean by "RTRT"…
#90
Super XP
Zen 3 is looking more and more like an upgrade option.
Looking forward to more information in 2020. :)