Thursday, September 12th 2019

AMD Updates Roadmaps to Lock RDNA2 and Zen 3 onto 7nm+, with 2020 Launch Window

AMD updated its technology roadmaps to reflect a 2020 launch window for its upcoming CPU and graphics architectures, "Zen 3" and RDNA2. The two will be based on 7 nm+, which is AMD-speak for the 7-nanometer EUV silicon fabrication process at TSMC. The node promises a significant 20 percent increase in transistor density, giving AMD a higher transistor budget and more clock-speed headroom. The roadmap slides, however, hint that unlike the simultaneous launch of "Zen 2" and RDNA on 7th July 2019, the next-generation launches may not coincide.
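To put that density claim in perspective, here is some quick back-of-the-envelope arithmetic on the quoted 20 percent figure (illustrative numbers only, not AMD or TSMC data):

```python
# Rough, illustrative arithmetic only: the sole input is the quoted
# "20 percent" density increase of N7+ (7 nm EUV) over N7.
n7_density = 1.0    # normalized transistor density on 7 nm
n7p_density = 1.2   # ~20% denser on 7 nm EUV

# Same die area -> ~20% more transistors to spend on cores, cache, CUs:
extra_budget = n7p_density / n7_density - 1
print(f"Extra transistor budget at iso-area: {extra_budget:.0%}")  # 20%

# Same transistor count -> the die shrinks to ~83% of its old area:
area_ratio = n7_density / n7p_density
print(f"Iso-transistor die area: {area_ratio:.0%} of the N7 die")  # 83%
```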

The slide for CPU microarchitecture states that the design phase of "Zen 3" is complete, and that the microarchitecture team has already moved on to developing "Zen 4." This means AMD is now developing products that implement "Zen 3." RDNA2, on the other hand, is still in its design phase. The crude x-axis on both slides, which denotes the expected shipping year, also appears to suggest that "Zen 3" based products will precede RDNA2 based ones. "Zen 3" will be AMD's first response to Intel's "Comet Lake-S," or even "Ice Lake-S," if the latter comes to fruition before Computex 2020. In the run up to RDNA2, AMD will scale RDNA up a notch with the "Navi 12" silicon to compete with graphics cards based on NVIDIA's "TU104" silicon. "Zen 2" will receive product-stack additions in the form of a new 16-core Ryzen 9-series chip later this month, and the 3rd generation Ryzen Threadripper family.
Source: Guru3D

103 Comments on AMD Updates Roadmaps to Lock RDNA2 and Zen 3 onto 7nm+, with 2020 Launch Window

#51
medi01
efikkanActual expectations for Polaris, Vega and Navi were all over the top.
Not for Navi. It beat expectations by quite a bit.
MephisOf course your reasoning is that because AMD is in both the next Xbox and PS5, they will dictate RT.
Of course.
MephisExcept that they were in there last round and it hasn't helped them at all.
Yeah, besides semi-custom revenue growing to about half of all revenue, and Sony & Microsoft spending tens of millions on AMD R&D that goes into non-console products as well.
Oh wait, you said it had NOT helped. Interesting.
MephisMicrosoft will dictate RT, and they already have with DirectX Ray Tracing (DXR).
Lol.
MephisRTX is already compatible with it (and Vulkan's version as well), and whatever form of RT hardware AMD uses will have to be compatible with it as well. It will all come down to the design of the hardware and drivers, and at this point there is no reason to believe that AMD will have any advantage.
Oh, dear, oh dear. RTX being compatible with an API that nV asked Microsoft to add, how surprising, lol.

AMD will largely influence what goes into the consoles, and it is largely up to AMD whether that sparks RT development or not.

It will be AMD setting the RT baseline, and it will again be AMD influencing what performance and what flavor of RT to push for. If they decide "nah, f*ck the Leather Man," so it will be. (See how it worked with FreeSync vs. G-Sync.)
Posted on Reply
#52
Mephis
medi01Yeah, besides semi-custom revenue growing to about half of all revenue, and Sony & Microsoft spending tens of millions on AMD R&D that goes into non-console products as well.
Oh wait, you said it had NOT helped. Interesting.
That's great and all, but the point I was making was that being in the consoles didn't help AMD set the direction for graphics features or performance at all.
medi01Lol.


Oh, dear, oh dear. RTX being compatible with an API that nV asked Microsoft to add, how surprising, lol.

AMD will largely influence what goes into the consoles, and it is largely up to AMD whether that sparks RT development or not.

It will be AMD setting the RT baseline, and it will again be AMD influencing what performance and what flavor of RT to push for. If they decide "nah, f*ck the Leather Man," so it will be. (See how it worked with FreeSync vs. G-Sync.)
FreeSync's success had nothing to do with AMD being in the consoles. It was all about the fact that G-Sync modules added a couple of hundred dollars to monitors.

Do you really think that Microsoft is going to create a new set of ray-tracing APIs for the consoles? No. Whatever solution AMD uses will have to be able to work with DXR.

Being in the consoles provided AMD with revenue, but it in no way has helped them gain a performance advantage over Nvidia. In fact, one could argue that it has stunted their ability to develop high-end GPUs, because they have had to spend so much of their engineering resources on the design and development of said console GPUs. We have all seen and heard the rumors that Navi was initially designed for the consoles first.

You are more than welcome to keep believing that AMD will set the direction of RT, but there is no historical evidence of that being the case.

Remember that Mantle was supposed to take over the world and usher in a whole new era of performance. Not so much.
Posted on Reply
#53
Aquinus
Resident Wat-man
MephisRemember that Mantle was supposed to take over the world and usher in a whole new era of performance. Not so much.
You're forgetting the part where Mantle essentially became Vulkan, which has been adopted by several games and projects. Vulkan is why I can use a project like DXVK to play games like Diablo 3 in Linux without a terrible performance hit. Even Doom's performance improvement using a Vulkan renderer over OpenGL was pretty significant. Same deal with DOTA, so it's not like there is no gain to be had.
MephisFreeSync's success had nothing to do with AMD being in the consoles.
FWIW, my Xbox One X plays very nicely with my 4K display and supports FreeSync. So, I wouldn't say it had nothing to do with it.
Posted on Reply
#54
Turmania
How did you lot turn this into an AMD/Nvidia debate?!? I have not bought an AMD product for half a decade, but they have finally gotten their act together. They are not ahead, but they have finally caught up with their competition. Yes, in order to do so they spent their die-shrink card whilst others still have it to use at will, but that is an issue for the future. Look at now! Things are great, we have competition: i5 9600K vs. Ryzen 5 3600X, now it's hard to choose; in the past it was a no-brainer. Should you buy a $300 GTX 1660 Ti, or for $40 extra get an RX 5700, which is almost 50% better performance for a little bit extra? These are good days; competition is great for us. Enjoy it.
Posted on Reply
#55
Space Lynx
Astronaut
AquinusYou're forgetting the part where Mantle essentially became Vulkan, which has been adopted by several games and projects. Vulkan is why I can use a project like DXVK to play games like Diablo 3 in Linux without a terrible performance hit. Even Doom's performance improvement using a Vulkan renderer over OpenGL was pretty significant. Same deal with DOTA, so it's not like there is no gain to be had.

FWIW, my Xbox One X plays very nicely with my 4K display and supports FreeSync. So, I wouldn't say it had nothing to do with it.
And yet Navi still has issues with FreeSync working above 75 Hz, according to a few Navi owners here :/ FreeSync and G-Sync are most important to me, as they really enhance the gaming experience. So until Navi can do it 100% at high refresh, I have to pass.
Posted on Reply
#56
Mephis
AquinusYou're forgetting the part where Mantle essentially became Vulkan, which has been adopted by several games and projects. Vulkan is why I can use a project like DXVK to play games like Diablo 3 in Linux without a terrible performance hit. Even Doom's performance improvement using a Vulkan renderer over OpenGL was pretty significant. Same deal with DOTA, so it's not like there is no gain to be had.

FWIW, my Xbox One X plays very nicely with my 4K display and supports FreeSync. So, I wouldn't say it had nothing to do with it.
Yes, Mantle became Vulkan, but it didn't set the world on fire and it isn't the leading graphics API. My point is that having their GPUs in both consoles this past round hasn't allowed AMD to control the direction of graphics technology. There is no reason to think that because their GPUs are in the next round, they will be able to control the direction of ray tracing. We are already seeing major developers adopt DXR in their engines and their games, and it will be the API for the next Xbox. AMD is not going to be able to force Nvidia to scrap all their work on RTX.

As far as FreeSync and the Xbox One X, yes, it works great with a compatible monitor, but again, it isn't the reason for their success. The majority of console gamers have their consoles hooked up to TVs, which until very recently haven't had the option for VRR of any kind. It was the monitor and PC market that set that direction.
Posted on Reply
#57
Aquinus
Resident Wat-man
MephisYes, Mantle became Vulkan, but it didn't set the world on fire and it isn't the leading graphics API.
There are more titles that support Vulkan than DX12, you know. ;)

As for RT, I think it's a fad and I'm hoping that the hype-train will run out of steam one of these days.
Posted on Reply
#58
Minus Infinity
efikkanI haven't seen any solid information on whether the next iteration will be a refined Navi or their next generation. But if it turns out to be Navi 2x, as you mentioned, the scaling from the 5700 to the 5700 XT shows how little room there is, so a ~50% larger chip would have to be clocked lower than the 5700 to retain decent efficiency. Whatever AMD launches in the Navi family will retain the same scaling characteristics. So for this reason, it matters whether AMD launches another iteration of Navi or something which (hopefully) is a more efficient architecture. If AMD's lineup for 2020 consists of refreshed and possibly larger Navis, then it's not going to be good enough, which is why I have stated several times before that a second Navi in 2020 is just too late to the party.

Lastly, just to put it out there, a second iteration of Navi may very well be some sort of professional version too, but we have no details on that.
The slides Lisa Su presented earlier in July showed Navi+ launching in 2020 on the 7nm+ process node. Navi+ is supposedly a next-gen architecture, not a refinement of RDNA, which is itself a refinement of GCN. It would be disappointing if it's not Navi+ but just Navi on 7nm+ with a few tweaks and higher clocks. Navi+ hopefully gets ray tracing support.

I'm torn on updating to any AMD this year. The X570 MBs are pretty crap, and I guess we won't see all-new designs until Zen 3 anyway. Navi is nice enough, but I want something more at 2080 levels. I'm torn between updating my old Ivy Bridge 3570K and GTX 1070 system to an R5 3600 and Navi 5700 XT, or waiting. R5 4600 + 6700 XT + B650 should be sweet (I'm just assuming names here).
Posted on Reply
#59
Space Lynx
Astronaut
AquinusThere are more titles that support Vulkan than DX12, you know. ;)

As for RT, I think it's a fad and I'm hoping that the hype-train will run out of steam one of these days.
Nvidia is forcing RT down our throats for the foreseeable future whether we like it or not, but I agree it's a fad. I honestly think a lot of games that support it look better with it off. Also, I prefer high frames. This is another PhysX scam.
Posted on Reply
#60
Aquinus
Resident Wat-man
lynx29Nvidia is forcing RT down our throats for the foreseeable future whether we like it or not
nVidia shoves nothing down my throat. I vote with my wallet. :laugh:

Honestly, having the best performance is secondary to upholding some of my values.
Posted on Reply
#61
TheinsanegamerN
AquinusThere are more titles that support Vulkan than DX12, you know. ;)

As for RT, I think it's a fad and I'm hoping that the hype-train will run out of steam one of these days.
It's still a drop in the bucket of the gaming industry. Literally 99.99% of games on the market are DX, not Vulkan.

I want AMD to succeed, but I'm not going to pretend they are going to dominate a market thanks to consoles. Remember, many PC games favor Nvidia over AMD despite being console ports, and Unreal Engine 4, one of the most widely used game engines on the market, favors Nvidia despite being optimized for consoles and used on a wide variety of AMD-powered platforms.

Anyone who is writing off Nvidia's ray tracing because AMD's ray tracing will be in the next-gen consoles is a fool. Nvidia still dominates the high end of PC gaming and has much closer ties to PC developers. Just as having 8 cores didn't magically make all games capable of taking advantage of said cores (as shown in the performance delta between the 6- and 8-core Ryzen chips), AMD RT being in consoles doesn't mean it will dominate PC gaming. That will only happen if AMD can also provide the high-end hardware and support needed to optimize game engines for that tech. AMD has always struggled with that last part.
Posted on Reply
#62
Mephis
AquinusThere are more titles that support Vulkan than DX12, you know. ;)

As for RT, I think it's a fad and I'm hoping that the hype-train will run out of steam one of these days.
I'll make you a gentleman's bet. Ray tracing is only going to get more prevalent, especially as GPUs get more powerful.

What we have now isn't full-on ray tracing. It is a hybrid approach combining rasterization and a little bit of ray tracing. There is no denying that full-on ray tracing looks a lot better and more realistic than the best rasterization has to offer. If you need any proof of that, just look at the movies: all computer effects these days are done using ray tracing. I know we aren't close to that point yet, but if offloading processing to the cloud (Stadia and whatever MS is cooking up) takes off, then it does become a possibility in the future.
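For readers unfamiliar with the hybrid approach, here is a minimal sketch of how such a frame is typically assembled; every function below is a hypothetical stub, not any real engine's API:

```python
# Illustrative sketch of a hybrid raster + ray-tracing frame, as used by
# current "RTX-style" renderers. Every function here is a hypothetical
# stub, not any real engine's API.

def rasterize(scene, camera):
    # Cheap: fills the G-buffer (position/normal/material per pixel).
    return f"G-buffer({scene}, {camera})"

def trace_rays(scene, gbuffer, effect, spp):
    # Expensive: so it runs at very low sample counts (~1 ray/pixel).
    return f"{effect} ({spp} spp, noisy)"

def denoise(image):
    # Sparse ray results are noisy; a denoiser cleans them up.
    return image.replace("noisy", "denoised")

def render_frame(scene, camera):
    gbuffer = rasterize(scene, camera)
    # Rays are spent only on effects rasterization handles poorly:
    reflections = denoise(trace_rays(scene, gbuffer, "reflections", 1))
    shadows = denoise(trace_rays(scene, gbuffer, "shadows", 1))
    # Composite the ray-traced terms over the rasterized shading:
    return gbuffer, reflections, shadows

print(render_frame("scene", "camera"))
```

The point of the hybrid split is that rays are spent only where rasterization falls short, which is what keeps frame rates playable on current hardware.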
Posted on Reply
#63
biffzinker
yakkBesides possibly Apple, AMD will have the most 7nm design experience as the process matures. In many ways probably more than Apple, as they use 7nm for both CPUs & GPUs. Exciting times ahead!
Pretty sure Apple has plenty of experience with CPUs/GPUs on TSMC's 7nm process with the A12 SoC, and now the recently announced A13. There's a high chance the A13 is fabbed on 7nm+.

Posted on Reply
#64
FinneousPJ
Wow, AMD only seems to be gaining momentum. I think I'll upgrade my CPU & GPU next year; can't wait for reviews!
Posted on Reply
#65
1d10t
efikkanWhere is your logical reasoning here? So just because Nvidia isn't 100% perfect, we can ignore all problems from AMD? This sort of argumentation is childish.
And if you actually read my whole post, you would also see: even Nvidia's drivers are not perfect, so there is plenty of room for improvement.
Turing launched with at least 3-4 bugs which were quickly fixed, which is a lot from Nvidia, and is probably the "worst driver launch" they've done in many years, and I've criticized it plenty for that reason. But at least their drivers are stable once the initial bugs are ironed out.
Do I have to highlight that again? :laugh:
efikkanYes, AMD's driver quality is nowhere close to Nvidia's unfortunately.
efikkanA few cards being DoA, and a failure rate of 2-5% is completely normal, and Turing is below that. Most of the reported problems were later dismissed as driver bugs (which were also fixed), and only a few cards were actually defective.
Claiming that one particular driver is so bad, while the other company's driver made your card dead :roll:
ParticleLuck of the draw. Bad cards do happen.
AMD/ATI cards I've had that died or quit working properly:
Radeon 9590 (dead)
Radeon X800 (dead)
Radeon X1900 XT (memory failed, visual corruption)
Radeon 5950 (UVD unit failed, system crashes)
Radeon Vega 64 (unstable, system crashes)
AMD/ATI cards I've had that work fine:
Radeon AIW 9700
Radeon 2900 Pro
Radeon 3870
Radeon 4850
Radeon 5950 (replacement)
Radeon 6970
Radeon 270
Radeon 390
Radeon VII
I run my cards at stock.
Read my previous post mate, someone made a bold claim that a bad driver caused cards to be duds :)
lynx29from what I understand it was only 1-3% of all users who got bad cards, and we don't really have any official data on how many RTX users got screwed over. Also, I never had any experience with the RTX cards, so again, out of my data set... sorry you don't understand basic logic. :/
I have nothing to say; other members are already stating my point :)
If you claim to have years of experience, why don't you use your knowledge to sort things out?
Posted on Reply
#66
Emu
lynx29my Nvidia and Intel system has given me 0 issues and I own over 1000 games... /shrug AMD seems to be crashing on lots of games according to many Navi owners and the GamersNexus review.
Nvidia likes to break their drivers every so often. There have been quite a few times in the 12 months or so that I have owned my 2080 Ti that I have had to skip a driver version because it crashed in certain games. Let's also not forget the drivers that actually broke Windows completely earlier this year, either.
Posted on Reply
#67
efikkan
Minus InfinityThe slides Lisa Su presented earlier in July showed Navi+ launching in 2020 on the 7nm+ process node. Navi+ is supposedly a next-gen architecture, not a refinement of RDNA, which is itself a refinement of GCN. It would be disappointing if it's not Navi+ but just Navi on 7nm+ with a few tweaks and higher clocks.
Anything Navi will not be a new architecture; if it's a new architecture, it will get a different name.
I think Lisa Su, as usual, is being intentionally vague.
Minus InfinityNavi+ hopefully gets ray tracing support.
Navi was not designed for ray tracing. Remember that Navi was supposed to launch in early 2018, even before Turing. If a Navi-family chip shows up with some kind of ray tracing support, it will be something primitive thrown in relatively late in the design process, and nothing integrated like Turing.
Minus InfinityI'm torn on updating to any AMD this year. The X570 MBs are pretty crap, and I guess we won't see all-new designs until Zen 3 anyway. Navi is nice enough, but I want something more at 2080 levels. I'm torn between updating my old Ivy Bridge 3570K and GTX 1070 system to an R5 3600 and Navi 5700 XT, or waiting. R5 4600 + 6700 XT + B650 should be sweet (I'm just assuming names here).
If your use case is primarily gaming, I suggest waiting one more cycle, and when you upgrade, buy something one tier up so it lasts longer. I generally recommend upgrading when you "need" more performance and there is something significantly better available.
Posted on Reply
#68
Aquinus
Resident Wat-man
efikkanNavi was not designed for ray tracing. Remember that Navi was supposed to launch in early 2018, even before Turing. If a Navi-family chip shows up with some kind of ray tracing support, it will be something primitive thrown in relatively late in the design process, and nothing integrated like Turing.
I'm not convinced that dedicated circuitry for RT is necessary to do RT well, even if that's the route nVidia went. The big issue I see with dedicated circuitry is that under most loads that hardware is going to sit inactive, which is a terrible waste of die space IMHO, even more so when you consider how big the 2080 Ti's die is: 754 mm² is a huge chip.
Posted on Reply
#69
efikkan
AquinusI'm not convinced that dedicated circuitry for RT is necessary to do RT well, even if that's the route nVidia went. The big issue I see with dedicated circuitry is that under most loads that hardware is going to sit inactive, which is a terrible waste of die space IMHO, even more so when you consider how big the 2080 Ti's die is: 754 mm² is a huge chip.
It depends on what degree of ray tracing you want to use. If you only want soft lighting and shadows (diffuse lighting), and can make do with few ray samples (like ~2 per 8x8 pixels or so), then yes, even current GPUs can do a decent job with OpenCL, CUDA, etc. But if you want sharp reflections (specular lighting) to look good, then you need to spend 50x the performance to get a decent result, which is something even RTX can't truly do yet.

And BTW, unlike what a certain YouTuber claims, the rest of the GPU will not idle while tracing rays; it's not like one part is used for legacy games and one part for ray tracing. The RT cores only accelerate the tracing of the rays, they don't give you a finished picture by themselves. The SMs will be active all the time, plus the TMUs etc. too.
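To put those sample budgets in rough perspective, some back-of-the-envelope arithmetic (assuming 2560x1440 at 60 FPS, and taking the ~2 rays per 8x8 pixels and ~50x figures above at face value):

```python
# Back-of-the-envelope numbers for the ray budgets quoted above.
# Assumptions (mine, illustrative): 2560x1440 rendering at 60 FPS.
width, height, fps = 2560, 1440, 60
pixels = width * height

diffuse_rays = pixels / (8 * 8) * 2   # "~2 rays per 8x8 pixels"
specular_rays = diffuse_rays * 50     # "50x the performance" for specular

print(f"Diffuse:  {diffuse_rays * fps / 1e6:,.1f} Mrays/s")   # ~6.9
print(f"Specular: {specular_rays * fps / 1e6:,.1f} Mrays/s")  # ~345.6
```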
Posted on Reply
#70
bug
AquinusI'm not convinced that dedicated circuitry for RT is necessary to do RT well, even if that's the route nVidia went. The big issue I see with dedicated circuitry is that under most loads that hardware is going to sit inactive, which is a terrible waste of die space IMHO, even more so when you consider how big the 2080 Ti's die is: 754 mm² is a huge chip.
You're probably not convinced we need a GPU for video either, and shudder at the thought of the resources that go unused while browsing the Internet :D

Edit: Just to be clear, as years go by, Turing will surely go down in history as a crude RTRT implementation. That much is certain. But since no hardware we have now is particularly suited to what these cores do (BVH handling), it is very possible their tasks cannot simply be folded onto traditional (or even enhanced) shaders, so the need for dedicated hardware will remain.
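As a minimal sketch of what that BVH handling involves (a 1-D toy with hypothetical types, purely illustrative; real implementations are stack-based GPU kernels, not recursive Python):

```python
# Minimal, illustrative BVH traversal: prune whole subtrees whose
# bounding volume the ray cannot hit, test primitives only at leaves.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Node:
    box: tuple                     # (lo, hi) bounds: a toy 1-D "AABB"
    prims: List[float] = None      # leaf only: primitive positions
    left: "Node" = None
    right: "Node" = None

def traverse(node: Optional[Node], ray_origin: float) -> Optional[float]:
    """Return the nearest primitive at/after ray_origin, or None."""
    if node is None:
        return None
    lo, hi = node.box
    if hi < ray_origin:            # prune: box entirely behind the ray
        return None
    if node.prims is not None:     # leaf: test the actual primitives
        hits = [p for p in node.prims if p >= ray_origin]
        return min(hits) if hits else None
    # inner node: walk both children, keep the nearer hit
    hits = [h for h in (traverse(node.left, ray_origin),
                        traverse(node.right, ray_origin)) if h is not None]
    return min(hits) if hits else None

# Toy two-leaf tree with primitives at 2.0, 5.0 and 9.0 along the ray axis
bvh = Node(box=(2.0, 9.0),
           left=Node(box=(2.0, 5.0), prims=[2.0, 5.0]),
           right=Node(box=(9.0, 9.0), prims=[9.0]))
print(traverse(bvh, ray_origin=3.0))   # 5.0: nearest hit at/after 3.0
```

The irregular, divergent pointer-chasing of that tree walk maps poorly onto wide SIMD shader execution, which is the core argument for dedicated traversal hardware.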
Posted on Reply
#71
Th3pwn3r
TurmaniaHow did you lot turn this into an AMD/Nvidia debate?!? I have not bought an AMD product for half a decade, but they have finally gotten their act together. They are not ahead, but they have finally caught up with their competition. Yes, in order to do so they spent their die-shrink card whilst others still have it to use at will, but that is an issue for the future. Look at now! Things are great, we have competition: i5 9600K vs. Ryzen 5 3600X, now it's hard to choose; in the past it was a no-brainer. Should you buy a $300 GTX 1660 Ti, or for $40 extra get an RX 5700, which is almost 50% better performance for a little bit extra? These are good days; competition is great for us. Enjoy it.
They have closed the gap but they definitely haven't caught up. Nvidia still has the fastest cards by a good clip.
Posted on Reply
#72
kapone32
If everything goes as expected, the next Navi (I am sure that no one thought Navi would beat the Vega 64) will be faster than the current iteration, and that can be nothing if not good. In terms of Zen 3, those chips will probably have a 5 GHz boost and a base clock somewhere in the 4.4 to 4.5 GHz range. This is all based on what we have already seen from AMD in terms of improvement from one step to the next.
Posted on Reply
#73
r9
bugIt's not as different as you think.
Back then AMD had the Athlon first (which brought the IPC fight to Intel). Then they had the Athlon XP (dreamy overclocking). Then the Athlon 64 came (64-bit and an integrated memory controller). And then we had the Athlon 64 X2 (real multicore).
So they did have an engineering roadmap back then, too. They just left marketing and sales to chance.
Funny you should mention the real multicore Athlon 64 X2, because this time around AMD brought the fight to Intel with a multi-die CPU :D.
Intel has tough times ahead: just being able to "print" 10nm/7nm CPUs is half of the problem; the other half is either getting them to clock 5 GHz or introducing a much better architecture to compensate for lower clocks while the technology gets up to speed.
IMO, Ryzen 2 is a letdown in my eyes clock-wise; I didn't believe the 5 GHz rumors, but I was expecting at least 4.6 GHz.
But I guess if people are happy with what Ryzen 2 offers, it leaves a lot of room for improvement for Ryzen 3.
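As a rough illustration of that clocks-versus-IPC trade-off (every number here is invented for the example):

```python
# Rough illustration of the clocks-vs-IPC trade-off; all numbers are
# made up for the example, not measured figures.
clock_old, ipc_old = 5.0, 1.00   # e.g. a mature node part at 5 GHz
clock_new = 4.0                  # a newer node clocking lower at first

# performance ~ IPC x clock, so to match the old part:
ipc_needed = clock_old * ipc_old / clock_new
print(f"IPC uplift needed to break even: {ipc_needed - 1:.0%}")  # 25%
```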
Posted on Reply
#74
medi01
Mephis...being in the consoles didn't help AMD set the direction for graphics features or performance at all.
Consoles set the baseline for game developers. I don't know if you are from the same planet as I am.
MephisIt was all about the fact that G-Sync modules added a couple of hundred dollars to monitors.
Sure, Joh, that chip's price was so high because it was really, really expensive to manufacture, and not because someone is really, really greedy.

AMD got its own version of Variable Refresh Rate into the DisplayPort and HDMI standards. End of story.
Posted on Reply
#75
Minus Infinity
efikkanAnything Navi will not be a new architecture; if it's a new architecture, it will get a different name.
I think Lisa Su, as usual, is being intentionally vague.


Navi was not designed for ray tracing. Remember that Navi was supposed to launch in early 2018, even before Turing. If a Navi-family chip shows up with some kind of ray tracing support, it will be something primitive thrown in relatively late in the design process, and nothing integrated like Turing.


If your use case is primarily gaming, I suggest waiting one more cycle, and when you upgrade, buy something one tier up so it lasts longer. I generally recommend upgrading when you "need" more performance and there is something significantly better available.
It's AMD's own naming; Navi+ is not the same as Navi. Navi was claimed to be an all-new architecture, but that's BS; it's an evolution of GCN. Navi+ is supposed to be a clean-slate architecture, and I haven't heard them refute that, even if it doesn't have ray tracing support. Navi won't compete very well against Ampere IMO, so Navi+ had better be more than a refresh with some tweaks and faster memory and clocks.
Posted on Reply