
AMD to Simultaneously Launch 3rd Gen Ryzen and Unveil Radeon "Navi" This June

Do we have any official statement on Navi APU performance? Where exactly is this "nearly doubled" performance coming from?
ATM only speculative leaks (anything from premature store listings to the infamous AdoredTV "leaks", pick your poison), all of which point to the R5 3600G having a 20CU Navi iGPU. Even if it has no per-core performance improvement over older-gen GPUs, it's still going to be as fast as the 20CU RX Vega M GH (or the identical iGPU in the Subor Z). In both cases that's performance equivalent to a discrete GTX 1060.

So did I last week; nice performance boost compared to my previous i7-4790K :)
I'm making a giant leap from my "temporary" i3-6100. Gonna get a nice performance boost with a watercooled 1600X. Those are so cheap nowadays, I just couldn't justify yet another "temporary" R3 1200 or Athlon GE before the 3000 series arrives.
 
ATM only speculative leaks (anything from premature store listings to the infamous AdoredTV "leaks", pick your poison), all of which point to the R5 3600G having a 20CU Navi iGPU. Even if it has no per-core performance improvement over older-gen GPUs, it's still going to be as fast as the 20CU RX Vega M GH (or the identical iGPU in the Subor Z). In both cases that's performance equivalent to a discrete GTX 1060.
What about the power draw?
Radeon VII * 20/64 is still over 100 W peak. How much more efficient would Navi have to be?
Either a miracle or a serious underclock (and there goes your GTX 1060 equivalent ;-))

What about die size?
Radeon VII * 20/64 is still 103 mm² - over 50% more than the GPU portion of the 2400G's die.
Moreover, the entire Zen 2 chiplet is ~75 mm².

I'm not saying it's not possible, but very difficult for sure.
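The scaling above can be spelled out as a quick back-of-envelope. Note the ~331 mm² die and ~330 W peak figures for Radeon VII are my rough assumptions, and naive linear scaling by CU count ignores uncore, memory PHY, and clocks:

```python
# Rough linear scaling of Radeon VII (64CU Vega 20) down to a 20CU iGPU.
# Assumed figures (approximate): ~331 mm^2 die, ~330 W peak board power.
VEGA20_CUS = 64
VEGA20_DIE_MM2 = 331
VEGA20_PEAK_W = 330

def scale(value, cus, base_cus=VEGA20_CUS):
    """Naive linear scaling by CU count; ignores uncore, PHY, and clocks."""
    return value * cus / base_cus

print(f"20CU die estimate:   {scale(VEGA20_DIE_MM2, 20):.0f} mm^2")  # ~103 mm^2
print(f"20CU peak power est: {scale(VEGA20_PEAK_W, 20):.0f} W")      # ~103 W
```

Even this optimistic napkin math lands right at the ~103 mm² / ~100 W figures quoted above, which is why the numbers look so tight for an APU.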

And what about the lost art of "making sense"? :-)
The RX 550-RX 570 still sell pretty well. So even if all those customers got an APU instead, it would have to cost $200 to fill the revenue gap. I don't think that's the price people expect.
Clearly, I'm not a fan of how AMD does their business, but even I don't think they are that stupid.
Or are they?
Gonna get a nice performance boost with a watercooled 1600X.
Honestly, why would anyone watercool a 1600X? :o
If you have to buy the cooler, it makes no economic sense at all.
Even if you already have a cooler that fits, you'd still likely be better off selling it and getting a faster CPU.
And it's not like there's some amazing OC potential in that CPU anyway.
 
What about the power draw?
Both Intel and AMD managed to do it with 20CU Vega in a sub-100 W package. Remember, both the Subor Z and the Hades Canyon NUC are slim small-form-factor PCs.

What about die size?
Once again, even the older process node allowed doing it with adequate die sizes. The 8809G was relatively large, but it also managed to fit in a pair of HBM stacks.

And what about the lost art of "making sense"? :)
RX550-RX570 still sell pretty well.
You can't sell rebranded/rehashed/refreshed Polaris forever, and I doubt AMD plans to manufacture it forever. Plus, the markets for the RX 500 series and Ryzen 3000 APUs don't really overlap. If a budget-tied person wants a GPU upgrade for a Haswell/Skylake or Zen 1 system, for example, they'll probably lean towards getting a $150-$200 GPU rather than spending twice that on a new platform with an equally fast APU.
 
C'mon AMD, get on top again. I'd love to relive those old Athlon XP days. AMD has not been the same since they bought ATI. ATI was beating NVIDIA back then too.

Their CPUs are on top; they have been capturing notable market share in an uphill battle for the past two years. Ryzen 3000 will be a decimation event.

And what about the lost art of "making sense"? :)
The RX 550-RX 570 still sell pretty well. So even if all those customers got an APU instead, it would have to cost $200 to fill the revenue gap. I don't think that's the price people expect.
I am continually perplexed by the ridiculous notion that companies shouldn't make their older products obsolete. Yeah, I know Intel sold quad cores for a decade - but that's not "the norm" lol. Yeah, I know Turing basically just jacked up prices, but that is because they are a near monopoly and overproduced Pascal. Don't let these companies get you so whipped that you yourself stop making any sense.

Polaris is old; they are happy to get rid of it. Furthermore, APUs are actually relatively cheap to make since they don't need their own PCB, GPU cooler, or (usually) onboard RAM. It's just a CPU die and a GPU die.
 
Ryzen 3000 will be a decimation event.
AdoredTV mentioned 3000 not coming any time soon.
On top of it being, at best, the same clocks +200 MHz.

On the other hand, #moarcoars.

Oh, and I'm pretty sure Intel will sell its chips in droves, even if the wildest dreams about Ryzen 3000 come true, no worries.
 
I don't have much faith in Navi; with AMD GPUs the hype is often exaggerated!

AMD has said very little as to what Navi will be. Almost all the hype is speculation, with little true information.

We know that Navi was never destined for use as a Radeon Professional GPU. There have been bits and pieces since 2016 suggesting that one of the things RTG and Raja Koduri were tasked with after Polaris was stripping out much of the unused functionality that basically only supported HPC and professional workloads. While this alone won't make GCN entirely efficient, that stripping "down" - along with efficiencies from Variable Rate Shading and GDDR6 - might "for once" free up refinement/fixes to the geometry engine and memory subsystem for gaming throughput. That plus 7 nm might finally deliver the first truly gaming-centric GCN architecture. If they can land between the Vega 56 and the 1070 Ti at 1440p, and do it at, say, a 160 W TDP, they have a chance. It's not going to be all that earth-shattering versus the competition, but if they can get it done for $280, it will be enough to hold out until the next-gen "macro-architecture" sometime in 2020-21.
 
I still have a difficult time understanding how and why Intel managed to subdue the mainstream market with quad cores for this long. I am glad to see AMD changing the course for once.
 
I still have a difficult time understanding how and why Intel managed to subdue the mainstream market with quad cores for this long. I am glad to see AMD changing the course for once.
Because, contrary to what you'll read on tech enthusiast forums, a quad core is overkill for what most people do with their PCs on a daily basis. If you don't believe me, fire up Task Manager and see how often it gets stuck grinding at 100% ;)
 
AdoredTV mentioned 3000 not coming any time soon.
On top of it being, at best, the same clocks +200 MHz.

On the other hand, #moarcoars.

Oh, and I'm pretty sure Intel will sell its chips in droves, even if the wildest dreams about Ryzen 3000 come true, no worries.

You missed the part where he highlighted the substantially higher AVX clocks on Rome... didn't you? Or, you know, the obviously impressive fact that Rome is doubling core count at the same TDP without sacrificing clock speed. Jesus, it's insane people are just ignoring that incredible engineering feat. Intel's answer is 56 cores clocked lower than their previous gen, with insane 200 W+ TDPs - it will be a bloodbath.

On the consumer side, they are supposedly willing to increase the TDP to 125-135 W, thus opening up room for greater clock speeds. But of course they only worry about efficiency in servers.

Because, contrary to what you'll read on tech enthusiast forums, a quad core is overkill for what most people do with their PCs on a daily basis. If you don't believe me, fire up Task Manager and see how often it gets stuck grinding at 100% ;)

All the time if you are a content creator. All. The. Time.

You and others seem to be stuck in a logic loop about quad cores in general. It's not that "nothing needs more than a quad core"; it's that "nothing used to be programmed for more because there were only quad cores." If 8 cores were the standard, games would make good use of them.
 
If 8 cores were the standard, games would make good use of them.
No they wouldn't.
Rendering itself can't scale to an arbitrary number of cores; even with the best effort, splitting the rendering up into more than 2-3 queues is rarely beneficial, since rendering is a pipelined, synchronous workflow. While it is technically possible to have multiple threads work on assembling a single queue, it is and will remain a bad idea, since the thread synchronization would kill performance outright. The future of rendering will continue to remove the CPU as a bottleneck, not by splitting queue assembly over more cores, but by moving more of the heavy lifting to the GPU itself.
The only benefit games get from more than 4-5 cores is for non-rendering tasks, e.g. streaming. This usually doesn't impact gaming performance much, but sometimes there are minor gains in frame-time consistency.
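For what it's worth, the diminishing returns being described here are just Amdahl's law. The 40% parallel fraction below is a made-up illustrative number, not a measurement of any engine:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# p = parallel fraction of the frame's CPU work, n = core count.
# Illustrative only: if queue assembly keeps ~60% of the work serial (p = 0.4),
# cores beyond 4 barely move the needle.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for cores in (2, 4, 8, 16):
    print(f"{cores:2d} cores: {amdahl_speedup(0.4, cores):.2f}x")
# 2 cores -> 1.25x, 16 cores -> only 1.60x; the ceiling is 1/(1-p) ≈ 1.67x.
```

The asymptote 1/(1-p) is why "just add cores" stops paying off once the serial pipeline stage dominates.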
 
All the time if you are a content creator. All. The. Time.
Yeah, well, I'm not a content creator all the time. Just when I get back from holidays.
No they wouldn't.
Rendering itself can't scale to an arbitrary number of cores; even with the best effort, splitting the rendering up into more than 2-3 queues is rarely beneficial, since rendering is a pipelined, synchronous workflow. While it is technically possible to have multiple threads work on assembling a single queue, it is and will remain a bad idea, since the thread synchronization would kill performance outright. The future of rendering will continue to remove the CPU as a bottleneck, not by splitting queue assembly over more cores, but by moving more of the heavy lifting to the GPU itself.
The only benefit games get from more than 4-5 cores is for non-rendering tasks, e.g. streaming. This usually doesn't impact gaming performance much, but sometimes there are minor gains in frame-time consistency.
Why do you even bother?
 
No they wouldn't.
Rendering itself can't scale to an arbitrary number of cores; even with the best effort, splitting the rendering up into more than 2-3 queues is rarely beneficial, since rendering is a pipelined, synchronous workflow. While it is technically possible to have multiple threads work on assembling a single queue, it is and will remain a bad idea, since the thread synchronization would kill performance outright. The future of rendering will continue to remove the CPU as a bottleneck, not by splitting queue assembly over more cores, but by moving more of the heavy lifting to the GPU itself.
The only benefit games get from more than 4-5 cores is for non-rendering tasks, e.g. streaming. This usually doesn't impact gaming performance much, but sometimes there are minor gains in frame-time consistency.
Right, because rendering geometry is all that is involved in games. Sigh.

The lack of creativity some people have.

Yeah, well, I'm not a content creator all the time. Just when I get back from holidays.

Then jog on; your input isn't needed here. Some people would like to not be held back by people who think PCs are only used for Fortnite and linear shooters with the same dumb AI that has been pigeon-walking around for 10 years.

It's like some people here don't want innovation, or maybe you have forgotten it can happen? Not sure. Or actually - I seriously think some people just want to be able to keep acting like their old quad cores are high-end.
 
Right, because rendering geometry is all that is involved in games. Sigh.

The lack of creativity some people have.



Then jog on; your input isn't needed here. Some people would like to not be held back by people who think PCs are only used for Fortnite and linear shooters with the same dumb AI that has been pigeon-walking around for 10 years.

It's like some people here don't want innovation, or maybe you have forgotten it can happen? Not sure. Or actually - I seriously think some people just want to be able to keep acting like their old quad cores are high-end.
You crack me up dude. You always do.
 
How do you know that?
I see people repeating it all the time, but I really haven't seen any argument why.
Nvidia is making CPUs - just not x86.

They haven't made a significant change in their GPU architecture since the HD 7000 series.
With no real improvements, they fall further behind with each new generation.
Why do you think that is?

A lot of companies make CPUs and GPUs; you know I'm talking about PC parts.
 
Ryzen 3000 series (high-end and mid-range) during Computex, the Radeon Navi reveal at Computex, and the low-end Ryzen 3000 series during summer 2019. That's what I predict. Why? They have done something similar for a few years now.
 
....Either way, this should be fun. AMD needs to release a strong GPU though, because Intel's is around the corner, and considering the money and manpower they're throwing at their GPU... I really think they're serious this time. Once Intel establishes their GPU, their focus will shift back to their CPUs. With AMD definitely moving in the right direction, nothing but exciting times lie ahead!!! All we need is NVIDIA to announce that they're making CPUs...
 
....Either way, this should be fun. AMD needs to release a strong GPU though, because Intel's is around the corner, and considering the money and manpower they're throwing at their GPU... I really think they're serious this time. Once Intel establishes their GPU, their focus will shift back to their CPUs. With AMD definitely moving in the right direction, nothing but exciting times lie ahead!!! All we need is NVIDIA to announce that they're making CPUs...
You're saying that as if launching one GPU is the same as establishing yourself as a player. It doesn't work like that. Intel's i740 was pretty disruptive back in the day, yet it didn't turn Intel into a GPU player.
Intel has the resources, but their GPU business will be anything but fire and forget.
 
You're saying that as if launching one GPU is the same as establishing yourself as a player. It doesn't work like that. Intel's i740 was pretty disruptive back in the day, yet it didn't turn Intel into a GPU player.
Intel has the resources, but their GPU business will be anything but fire and forget.

....Given the resources Intel is putting behind it, it's just my belief that it will be disruptive to the mid and low tiers. I don't expect a 2080 Ti challenger or anything, but I do believe their presence will be felt and should not be taken lightly, given the philosophy and resources AMD has shifted away from GPUs to perfect their CPUs.
 
....Given the resources Intel is putting behind it, it's just my belief that it will be disruptive to the mid and low tiers. I don't expect a 2080 Ti challenger or anything, but I do believe their presence will be felt and should not be taken lightly, given the philosophy and resources AMD has shifted away from GPUs to perfect their CPUs.
Right, from that point of view, Intel indeed can take some liberties AMD cannot afford.
 
I still have a difficult time understanding how and why Intel managed to subdue the mainstream market with quad cores for this long. I am glad to see AMD changing the course for once.

It was easy for Intel when the alternative was almost irrelevant at the time.

When you have quad-core CPUs that are faster than the competition, you can sell them for years and people will buy them.
 
You missed the part where he highlighted the substantially higher AVX clocks on Rome...didn't you?
I did.

Or, you know, the obviously impressive fact that Rome is doubling core count at the same TDP without sacrificing clock speed. Jesus, it's insane people are just ignoring that incredible engineering feat. Intel's answer is 56 cores clocked lower than their previous gen, with insane 200 W+ TDPs - it will be a bloodbath.
Enterprise still won't switch to AMD overnight, which means Intel will continue to sell pretty damn well even if a bloodbath takes place, as far as benchmarks go.
 
This is great news.
Though I think there may be an issue with the Zen 2 BIOS updates for all motherboard manufacturers. There's talk about some sort of performance regression, though the regression is only ~5%.
Whatever the issue, AMD needs to fix it, or current Zen owners may suffer additional latency issues.


AdoredTV mentioned 3000 not coming any time soon.
On top of it being, at best, the same clocks +200 MHz.

On the other hand, #moarcoars.

Oh, and I'm pretty sure Intel will sell its chips in droves, even if the wildest dreams about Ryzen 3000 come true, no worries.
What AdoredTV stated is that the 3000 series is ready, but the X570 chipset may require a little more time for mobo manufacturers to implement. Or they are ironing out BIOS updates for stability and performance improvements.
 
AMD, AMD all the way. If there is any reason to get excited about AMD currently, it's the fact that anyone who bought into the first gen knows there is no need to buy a new motherboard to get the 3rd-gen CPUs. As far as Navi is concerned, I would love to get some official information from AMD, but GDDR6 should allow for some nice improvements over Polaris. We should all be excited about 7 nm for AMD. As an example, the first-gen R7 could not go above 4.1 GHz, while the 2nd gen goes to 4.4. So the 3rd gen may do exactly what the rumours have been saying: 5.0 GHz. If that is the case, and they can increase the L1 through L3 caches among other things at 7 nm, there will be no reason to buy Intel just for gaming anymore.
 