# AMD Ryzen 9 3000 is a 16-core Socket AM4 Beast



## btarunr (May 9, 2019)

AMD is putting the finishing touches on its 3rd generation Ryzen socket AM4 processor family, which is slated for a Computex 2019 unveiling, followed by possible E3 market availability. Based on the "Matisse" multi-chip module that combines up to two 8-core "Zen 2" chiplets with a 14 nm I/O controller die, these processors see a 50-100 percent increase in core counts over the current generation. The Ryzen 5 series now includes 8-core/16-thread parts, the Ryzen 7 series chips are 12-core/24-thread, while the newly created Ryzen 9 series (designed to rival Intel's Core i9 LGA115x parts) will include 16-core/32-thread chips. 

Thai PC enthusiast TUM_APISAK confirmed the existence of the Ryzen 9 series, having landed himself an engineering sample of the 16-core/32-thread chip that ticks at 3.30 GHz with a 4.30 GHz Precision Boost frequency. The infamous AdoredTV leaks that drew the skeleton of AMD's 3rd generation Ryzen roadmap referenced two desktop Ryzen 9 parts, the Ryzen 9 3800X and Ryzen 9 3850X. The 3800X is supposed to be clocked at 3.90 GHz with 4.70 GHz boost and a TDP rating of 125 W, while the 3850X tops the charts at 4.30 GHz base and a staggering 5.10 GHz boost, with its rated TDP shooting up to 135 W. We can now imagine why some motherboard vendors are selective with BIOS updates on some of their lower-end boards. AMD is probably maximizing the clock-speed headroom of these chips out of the box, to preempt Intel's "Comet Lake" 10-core/20-thread processor.





*View at TechPowerUp Main Site*


----------



## TheLaughingMan (May 9, 2019)

Is this a confirmation that these rumors are true or simply that they exist?


----------



## DeathtoGnomes (May 9, 2019)

Wonder if these will still drop into existing motherboards without much hassle, other than a BIOS update.


----------



## Ebo (May 9, 2019)

Seems like we're in for a very fun time ahead, all of us.


----------



## R0H1T (May 9, 2019)

TheLaughingMan said:


> Is this a confirmation that these rumors are true or simply that they exist?


*We can neither confirm nor deny* ~~these allegations~~ this information


----------



## EarthDog (May 9, 2019)

TheLaughingMan said:


> Is this a confirmation that these rumors are true or simply that they exist?


If you look at the Twitter post, well, it could be correct, but I would still classify it as a rumor.

If you read this article, it appears it is confirmed...

If you read the Tom's article, this is still a rumor (and worded properly for the evidence we have - maybe bt knows something we don't... but I wouldn't have written it in such a factual manner).


----------



## dyonoctis (May 9, 2019)

F-five GHz? On a 16-core from AMD??? Please don't let this be a hype scheme. I'm taking this with a truckload of salt, but I'm still hoping just a bit.


----------



## kapone32 (May 9, 2019)

I think it is entirely possible that we could see 5 GHz. The 9590 was a 5 GHz CPU from AMD. With the node shrink, I don't think a 600 MHz increase is unreasonable. After all, you were lucky if you could get a 1st-gen Ryzen chip over 4 GHz, and then the 2400G got released and easily OCs to 4 GHz. The 2nd gen also saw an increase of 400 MHz for the 2700X vs. the 1700X. And even though it has more cores, the 2950X OCs just as high as the 2700X despite having another CPU on the die.


----------



## GorbazTheDragon (May 9, 2019)

The context of the original AdoredTV leak is very different from that of the ES leak. The original leak came very early in 7 nm manufacturing; those were very likely target clocks more than anything definitive. That leak has already been put in doubt several times, particularly the lower-core-count clock speeds and whether a 16c chip would actually come out early in the release.

Secondly, we do not know anything about the ES: no details as to whether the clocks are final or which model it is associated with.

Lastly, we don't actually know what the capabilities of 7 nm silicon are in CPUs. Based on the difference between 14 nm and 7 nm Vega, I personally think it is reasonable to expect the maximum capable clocks to be higher, but again we don't know what the power will be like, so we can't really know what kind of clocks to expect in the "out of box" TDP range.


----------



## R0H1T (May 9, 2019)

There are also the moving targets that are Comet Lake & Rocket Lake(?) ~ Intel Drivers Reveal 400, 495 Series Chipsets for Comet Lake, Ice Lake - New Year, New Socket, Same 14 nm Process

AMD's likely fine-tuning clocks as they did with the original Zen. If the max 1c clocks reach 4.8~5 GHz, then you can bet they'll try & get the base clocks pretty high as well.


----------



## HD64G (May 9, 2019)

DeathtoGnomes said:


> Wonder if these will still drop into existing motherboards without much hassle, other than a BIOS update.


Most probably, select mobos with good enough VRMs will handle those CPUs with just a BIOS update. The rest will get up to the 95 W ones.


----------



## Tomgang (May 9, 2019)

You smell that? It's the hype train getting really hot now. 16 cores sounds really good, and if the rumored clock speeds are true as well, Ryzen 3000 chips are gonna be a pain in the ass for Intel. Don't get me wrong, I am an Intel man, but Intel needs a butt kick 'cause their prices are absurd these days, and someone needs to challenge them for real now.


----------



## WikiFM (May 9, 2019)

_AMD is probably maximizing the clock-speed headroom of these chips out of the box, to preempt Intel's "Cannon Lake" 10-core/20-thread processor._

There's a typo: it should be Intel's Comet Lake; Cannon Lake would never reach 10 cores, hehe.

I will wait and see the IPC, since that is what matters for games; no game uses more than 8 threads anyway, so 12 and 16 cores are useless for most gamers.


----------



## Chrispy_ (May 9, 2019)

For the consumers among us who are just gaming/browsing - what's the highest frequency Zen2 chip leaked? Number of cores is irrelevant since 4-6 threads is just about all that any game engine effectively uses at the moment.


----------



## GorbazTheDragon (May 9, 2019)

Chrispy_ said:


> For the consumers among us who are just gaming/browsing - what's the highest frequency Zen2 chip leaked? Number of cores is irrelevant since 4-6 threads is just about all that any game engine effectively uses at the moment.


Having dabbled with a 5675c the last few months I think you have to add quite a lot of overhead for background tasks. Even if the individual tasks are very low load, the thread switching ends up taking a huge amount of time so having a large thread count is almost always better.


----------



## eidairaman1 (May 9, 2019)

kapone32 said:


> I think it is entirely possible that we could see 5 GHz. The 9590 was a 5 GHz CPU from AMD. With the node shrink, I don't think a 600 MHz increase is unreasonable. After all, you were lucky if you could get a 1st-gen Ryzen chip over 4 GHz, and then the 2400G got released and easily OCs to 4 GHz. The 2nd gen also saw an increase of 400 MHz for the 2700X vs. the 1700X. And even though it has more cores, the 2950X OCs just as high as the 2700X despite having another CPU on the die.



My FX 8350 does 5.0 on air, older node to boot...


----------



## TheoneandonlyMrK (May 9, 2019)

eidairaman1 said:


> My FX 8350 does 5.0 on air, older node to boot...


Mine did too, got to 5.5 on water.

It's like the past didn't exist: if AMD processors regularly hit 100°, never mind FX's high of 90, they would be roasted in forums, yet massive power and heat are OK for team blue.

Anyway, roll on R9 day. I just hope I can afford one. Think of the crunching: I would be up 20 more cores, at the same or higher speed, for 40-60 watts more. Win.


----------



## EarthDog (May 9, 2019)

theoneandonlymrk said:


> Mine did too, got to 5.5 on water.
> 
> It's like the past didn't exist: if AMD processors regularly hit 100°, never mind FX's high of 90, they would be roasted in forums, yet massive power and heat are OK for team blue.
> 
> Anyway, roll on R9 day. I just hope I can afford one. Think of the crunching: I would be up 20 more cores, at the same or higher speed, for 40-60 watts more. Win.


What does Bulldozer have to do with Ryzen? Clearly there is something in the arch which is limiting clock speeds. With refinements, we should see those limits go up! But yeah, it's like night and day, Ryzen from BD...

RE: heat, again, it's the arch of BD that doesn't support high temps. What does it really matter, though? If it can support the temps and live a long life, it can. It doesn't matter if it tops out at 70°C or 100°C. There are several other factors which influence the temperature readings. I am not saying they are unrelated, but they are not tied at the hip. 

Nobody likes the increased power consumption, but let's be clear here: heat and temperature are two different things. For example, which has the higher temperature: a lighter with a yellow flame or a bonfire with yellow flames? The answer... they are the same temperature, but clearly the bonfire has more energy behind it. I mean, I have seen 5 W ASICs on mining rigs burn my finger...


----------



## Zareek (May 9, 2019)

Chrispy_ said:


> For the consumers among us who are just gaming/browsing - what's the highest frequency Zen2 chip leaked? Number of cores is irrelevant since 4-6 threads is just about all that any game engine effectively uses at the moment.



Agreed, but the leaks say the R9 3850X (16c/32t @ 4.3 GHz base, 5.1 GHz boost) will have the highest frequency. That is followed by the R7 3700X (12c/24t @ 4.2 GHz base, 5 GHz boost). Personally, I'd like an 8-core with the best clocks they can pull off! The leaks on IPC are all over the place, claiming anywhere from 10-29% depending on the task. All of them seem to agree AMD will still lag in gaming.


----------



## Deleted member 158293 (May 9, 2019)

Competition is good!

Otherwise we'd still have 4-core CPUs at $500+. Just incredible.


----------



## HimymCZe (May 9, 2019)

Considering even Ryzen 2000 has better IPC than the i9, 
even a "just" 4.3 GHz Ryzen 3000 will EASILY beat the i9. Every task. Every game. Every price.
All hail the new king.


----------



## kastriot (May 9, 2019)

I predict more than 100 comments here.


----------



## GoldenX (May 9, 2019)

HimymCZe said:


> Considering even Ryzen 2000 has better IPC than the i9,
> even a "just" 4.3 GHz Ryzen 3000 will EASILY beat the i9. Every task. Every game. Every price.
> All hail the new king.


We have to see if AMD solved the latency problems before that. Ryzen has better IPC, but it has a lot worse latency; that's why Intel is still king in games, even with its slower models.


----------



## TheoneandonlyMrK (May 9, 2019)

EarthDog said:


> What does Bulldozer have to do with Ryzen? Clearly there is something in the arch which is limiting clock speeds. With refinements, we should see those limits go up! But yeah, it's like night and day, Ryzen from BD...
> 
> RE: heat, again, it's the arch of BD that doesn't support high temps. What does it really matter, though? If it can support the temps and live a long life, it can. It doesn't matter if it tops out at 70°C or 100°C. There are several other factors which influence the temperature readings. I am not saying they are unrelated, but they are not tied at the hip.
> 
> Nobody likes the increased power consumption, but let's be clear here: heat and temperature are two different things. For example, which has the higher temperature: a lighter with a yellow flame or a bonfire with yellow flames? The answer... they are the same temperature, but clearly the bonfire has more energy behind it. I mean, I have seen 5 W ASICs on mining rigs burn my finger...


I think you're getting the wrong end of my stick; I'm not and was never concerned.
But me and Eidairaman know. We took some stick on FX for heat and power; now that it's Intel's turn, it's OK. That's what I meant.
As for 5 GHz, it's becoming less relevant and wasn't really that important to me, a 60 Hz-ish gamer.

Oh, and don't be foolish: the IP in BD, at least some of it, went into Ryzen. I find it all ironic tbh.


----------



## TheLaughingMan (May 9, 2019)

GoldenX said:


> We have to see if AMD solved the latency problems before that. Ryzen has better IPC, but it has a lot worse latency, that's why Intel is still king in games, even on the slower models.



No. It has been shown dozens of times that Intel is still king mainly because of clock speed and IPC. Whenever the clock for clock tests are done, Intel loses almost its entire gaming performance lead and drops to low single digit % leads. Latency is an issue, but not nearly the game performance killer you think.


----------



## EarthDog (May 9, 2019)

theoneandonlymrk said:


> I think you're getting the wrong end of my stick; I'm not and was never concerned.
> But me and Eidairaman know. We took some stick on FX for heat and power; now that it's Intel's turn, it's OK. That's what I meant.
> As for 5 GHz, it's becoming less relevant and wasn't really that important to me, a 60 Hz-ish gamer.
> 
> Oh, and don't be foolish: the IP in BD, at least some of it, went into Ryzen. I find it all ironic tbh.


I was just replying to the words I saw. Perhaps I misunderstood which side of the stick you were dishing out. Lol... took some 'stick'? Haha, is that a typo?

While surely there was some influence there (BD to Ryzen), the point I was trying to make had everything to do with heat and temperatures, as what each CPU can handle there is quite different. Nobody is concerned with that because the arch can NOW handle higher temps. 


kastriot said:


> I predict more than 100 comments here


I predict a useless post is useless. 



GoldenX said:


> Ryzen has better IPC,





HimymCZe said:


> considering even Ryzen 2000 have better IPC than i9,





TheLaughingMan said:


> No. It has been shown dozens of times that Intel is still king mainly because of clock speed and IPC. Whenever the clock for clock tests are done, Intel loses almost its entire gaming performance lead and drops to low single digit % leads.


Does it? Are there other reviews showing something different from this one?








*4GHz CPU Battle: AMD 2nd-Gen Ryzen vs. Intel 8th-Gen Core* (www.techspot.com): "Gaming Benchmarks. We can say upfront that this article is in no way buying advice, but we're testing purely for the science of it."
				




Compare apples to apples (6c/12t) and look at the 1600X at 4 GHz versus the 8700K at 4 GHz...

Now... their SMT IS more efficient than Intel's HT in heavily threaded benchmarks which use SMT. But IPC measurements are typically single-threaded, and SMT/HT can't be involved; otherwise it's a multi-threaded benchmark, which shows the difference between HT and SMT.


----------



## GoldenX (May 9, 2019)

TheLaughingMan said:


> No. It has been shown dozens of times that Intel is still king mainly because of clock speed and IPC. Whenever the clock for clock tests are done, Intel loses almost its entire gaming performance lead and drops to low single digit % leads. Latency is an issue, but not nearly the game performance killer you think.


Yeah, we will see if the chiplet design solves it for good.


----------



## Dave65 (May 9, 2019)

Was there any doubt?


----------



## kid41212003 (May 9, 2019)

I'm starting to regret buying the 8700K...


----------



## GoldenX (May 9, 2019)

kid41212003 said:


> I'm starting to regret buying the 8700K...


Don't. It can serve very well for many years. By the time it's really due for replacement, AM4 will be old.


----------



## TristanX (May 9, 2019)

7 nm allows for roughly half the power consumption of 16 nm, so there you have it: a 16C drawing the same power as a single Ryzen 2700X. There won't be clocks faster than 4.3 GHz, because there is an additional I/O chip, and raising clocks would increase power consumption beyond the power delivery capacity. Some may try to OC, but traditionally, Ryzens do not OC well.
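A back-of-the-envelope sketch of that claim, assuming (as the post does) that 7 nm roughly halves per-core power at similar clocks; the 105 W figure is the 2700X's rated TDP, and the even per-core split is a simplification:

```python
# Rough per-core power model based on the Ryzen 2700X's rated 105 W TDP.
cores_14nm = 8
tdp_2700x = 105.0  # W
power_per_core_14nm = tdp_2700x / cores_14nm  # ~13.1 W per core

# The post's premise: 7 nm halves per-core power at similar clocks.
power_per_core_7nm = power_per_core_14nm / 2

# A 16-core part built from two 8-core chiplets:
total_power_16c = 16 * power_per_core_7nm
print(total_power_16c)  # 105.0 -> same power envelope as one 2700X
```

Note this ignores the I/O die's own draw, which the post itself flags as a constraint on clocks.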


----------



## xorbe (May 9, 2019)

TheLaughingMan said:


> Is this a confirmation that these rumors are true or simply that they exist?


This is the big question. We can safely assume 16/32 exists, but will they offer it, and when?


----------



## xkm1948 (May 9, 2019)

16c/32t is great, but feeding that many cores with dual-channel DDR4, and with the IMC outside the compute cores? That memory subsystem performance may not look good at all.


----------



## Eric3988 (May 9, 2019)

Sounds very intriguing, let the price wars commence!


----------



## Berfs1 (May 9, 2019)

It’s literally like I told y’all this ages ago lmfao... https://docs.google.com/spreadsheets/d/12WWF2RjKCXQZvV-zcDl9xAk6t3GYJuk2DdovDnoGXpA


----------



## bug (May 9, 2019)

Even if this is totally legit, feeding 16c/32t needs at least a quad-channel RAM setup. It's not going to make its way into the mainstream.
Me, I'm just curious what I can get in the $200-300 range to make me ditch my current CPU.


----------



## mtcn77 (May 9, 2019)

GorbazTheDragon said:


> Having dabbled with a 5675c the last few months I think you have to add quite a lot of overhead for background tasks. Even if the individual tasks are very low load, the thread switching ends up taking a huge amount of time so having a large thread count is almost always better.


Good chip; wrong task. The L4 helps in deep cache searches, emphasizing single-thread performance. You tried to multitask, and that cuts the cache lines in half.


----------



## Joss (May 9, 2019)

bug said:


> feeding 16c/32t needs at least a quad-channel RAM setup. It's not going to make its way into the mainstream


Maybe AMD is coming up with its own version of Intel's HEDT.


----------



## GoldenX (May 9, 2019)

Joss said:


> Maybe AMD is coming up with its own version of Intel's HEDT.


Yeah, that's Threadripper's idea; AM4 is designed with only dual channel in mind.


----------



## XiGMAKiD (May 9, 2019)

A 16-core AM4 chip is possible and almost everyone knows it; it's the clock speed that's a bit unbelievable.


----------



## Manu_PT (May 9, 2019)

GorbazTheDragon said:


> Having dabbled with a 5675c the last few months I think you have to add quite a lot of overhead for background tasks. Even if the individual tasks are very low load, the thread switching ends up taking a huge amount of time so having a large thread count is almost always better.



GamersNexus talked about that in one of his recent videos and clearly stated it does not work like that. Having stuff open in the background while gaming doesn't benefit from 50 cores. RAM, yes. Unless you are encoding or rendering while playing games.



yakk said:


> Competition is good!
> 
> Otherwise we'd still have 4-core CPUs at $500+. Just incredible.



Which quad core was more than $500?? Can't remember



HimymCZe said:


> Considering even Ryzen 2000 has better IPC than the i9,
> even a "just" 4.3 GHz Ryzen 3000 will EASILY beat the i9. Every task. Every game. Every price.
> All hail the new king.



Ryzen 2000 has better IPC than Coffee Lake? Lol, ok.



kid41212003 said:


> I'm starting to regret buying the 8700K...



If you mostly do gaming, don't worry. Intel will still be faster than Zen 2. You read it here first. If you are into multi-threaded apps, the 2700 would have been your best friend anyway.


----------



## GoldenX (May 9, 2019)

Manu_PT said:


> Which quad core was more than $500?? Can't remember


Not many, mostly older ones: the Core 2 Quad Extreme line, the Core i7 Extreme Edition line, the i7-870 and 880.
After that, mostly USD 300-350, for *ten* years.


----------



## theeldest (May 9, 2019)

I think we'll still see slightly higher clocks on the final SKUs. We already have direct 14 nm vs. 7 nm comparisons from AMD with Vega and the Radeon VII. From the Vega 64 to the VII, clocks went up 12.2% for base and 13% for boost. That's like an 1800X going from 3.6/4.0/4.1 to 4.0/4.5/4.6 (base/turbo/XFR). That's pretty reasonable.

And the fact that it's 16 cores doesn't really matter, as it's more about whether you can get two 8-core chiplets to run at 4.0/4.6, which is much easier.
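The scaling arithmetic in that post can be sanity-checked with a quick sketch; the uplift percentages (Vega 64 to Radeon VII) and the 1800X clocks are the figures quoted above:

```python
# Apply the Vega 64 -> Radeon VII clock uplift to the Ryzen 7 1800X's
# stock clocks, as a rough what-if rather than a prediction.
base_uplift = 1.122   # +12.2% base clock
boost_uplift = 1.130  # +13% boost clock

r7_1800x = {"base": 3.6, "turbo": 4.0, "xfr": 4.1}  # GHz

scaled = {
    "base": round(r7_1800x["base"] * base_uplift, 1),
    "turbo": round(r7_1800x["turbo"] * boost_uplift, 1),
    "xfr": round(r7_1800x["xfr"] * boost_uplift, 1),
}
print(scaled)  # {'base': 4.0, 'turbo': 4.5, 'xfr': 4.6}
```

Which lands exactly on the 4.0/4.5/4.6 figures in the post.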


----------



## TheoneandonlyMrK (May 9, 2019)

xkm1948 said:


> 16c/32t is great, but feeding that many cores with dual-channel DDR4, and with the IMC outside the compute cores? That memory subsystem performance may not look good at all.


It's been a while since it was done, and even then it was off-chip, not on-chip (MCM), but it's quite possible it's fine, you know; we'll see.
It's not like other chips and systems haven't done similar.


----------



## lsevald (May 9, 2019)

Personally, I want something that performs on par with or better than the 9900K, both in apps and games, for half the price. Can we please have an 8c/16t Ryzen 5 "3650X", AMD? Too bad if the R5s only get the lower-binned dies: 4.0 base/4.8 boost according to rumor, while the R9s go as high as 5.1 GHz.


----------



## biffzinker (May 9, 2019)

xkm1948 said:


> but feeding that many cores with dual-channel DDR4, and with the IMC outside the compute cores? That memory subsystem performance may not look good at all.


A last-level (L4) cache on the I/O die might be a way around that holdup.


----------



## advanced3 (May 9, 2019)

While I believe the core count, I am not buying into the clock speeds.


----------



## bonehead123 (May 10, 2019)

Well, see, the $64MM question here is:

Now that they know that you know that they know you know, whatchaz gonna do 'bout it ?

hehehe......


----------



## lexluthermiester (May 10, 2019)

EarthDog said:


> maybe bt knows something we don't...


He might. However, even if it's just a hypothesis, he's been right many times before. I think it's safe to say that if this info isn't spot-on, it'll be very close.



advanced3 said:


> While I believe the core count, I am not buying into the Clock speeds.


Why not?


----------



## GoldenX (May 10, 2019)

advanced3 said:


> While I believe the core count, I am not buying into the Clock speeds.


No one believed Zen's IPC gain over FX, myself included, and yet here we are.


----------



## Darmok N Jalad (May 10, 2019)

I still don’t see why 5.0+ isn’t possible. Zen 1 was a first try at a new architecture, and Zen 1.5 wasn’t a massive departure as much as it was some tweaks and a new node. Zen 2 is AMD’s first real refinement, and it’s on 7nm to boot. It sure seems like current Zen offerings have an architectural limit preventing higher clocks. Just a wild guess, but that may be one of the things that the chiplet design solves, as the IO, IMC, and IF components may be what was so sensitive to faster clocks. Won’t be long now to find out. I hope AMD discusses it when Zen 2 arrives.


----------



## moproblems99 (May 10, 2019)

EarthDog said:


> Nobody likes the increased power consumption, but, let's be clear here, heat and temperature are two different things. For example, which has the higher temperature... a lighter with a yellow flame or a bonfire with yellow flames? The answer.......they are the same temperature, but cleary a bonfire has more energy behind the temperature. I mean I have seen 5W ASICs on mining rigs burn my finger...



It's probably because that is what everyone shits on AMD about... but the heat, but the drivers, but the temperatures. An Intel CPU gobbling up 200 W more power for 5% more performance over AMD: TAKE MY MONEY!

It really is funny to watch at times.



EarthDog said:


> Does it? Are there other reviews showing different than this one?
> 
> 
> 
> ...



Why would they use the 2600X as their comparison when the 2700X was less than 5% different nearly all the time?

EDIT: Duh, thread count. My bad.

EDIT 2: The "200 W more power for 5%" figures are just rough approximations, not intended to be scientifically accurate.


----------



## EarthDog (May 10, 2019)

moproblems99 said:


> It's probably because that is what everyone shits on AMD about...But the heat, but drivers, but the temperatures. Intel CPU gobbling up 200W more power for 5% more performance over AMD: TAKE MY MONEY!


Maybe their GPUs. CPUs aren't really an issue.

I can top out a 2700X at 4.2 GHz all-core with a subpar 240 AIO (same with most 9900Ks at 5 GHz). It's not temps that hold these things back. The world record for a screenshot of a 2700X is 6 GHz; for a 9900K, 7.6 GHz... the arch on this process just doesn't have it in it for higher clocks, regardless of temps. BD was the opposite: 8 GHz+ while Intel was at 7 GHz+.

I really hope that 16c/32t monstrosity is for TR... this blurring of the lines and the core wars is getting old before it really starts, lol.


----------



## TheLostSwede (May 10, 2019)

TheLaughingMan said:


> Is this a confirmation that these rumors are true or simply that they exist?



Yes


----------



## moproblems99 (May 10, 2019)

EarthDog said:


> It's not temps that hold these things back.



Not really talking about limits, thresholds, or holding back. More of in general. 

Although, I don't believe those leaks one bit. If Zen 2 makes it over 4.6, I'd be shocked. Like you said, it's not thermals or power holding these back.


----------



## EarthDog (May 10, 2019)

In general, that's hilarious because outside of GPUs, I don't recall people bitching about drivers, temps, etc., as those typically aren't a limit either...


----------



## moproblems99 (May 10, 2019)

EarthDog said:


> I really hope that 16c/32t monstrosity is for TR... this blurring of the lines and the core wars is getting old before it really starts, lol.



Honestly, I don't really care if they put 1,000 cores on it if the price is reasonable. And by price, I mean total system price. If they keep boards to a reasonable price with the necessary features, instead of just adding useless features and driving the price up, what is the harm? People can just buy the CPU with the core count that fits them. You don't have to buy the highest-numbered part just to say you did.

Again, I'm saying 'you' in general, not 'you' as in EarthDog.



EarthDog said:


> In general, that's hilarious because outside of GPUs, I don't recall people bitching about drivers, temps, etc., as those typically aren't a limit either...



That is my point: temperature and power are only a concern when 'their brand' is winning in those categories. Otherwise, no one really cares.


----------



## TheLostSwede (May 10, 2019)

advanced3 said:


> While I believe the core count, I am not buying into the Clock speeds.



You should.



moproblems99 said:


> Not really talking about limits, thresholds, or holding back. More of in general.
> 
> Although, I don't believe those leaks one bit. If Zen 2 makes it over 4.6, I'd be shocked. Like you said, it's not thermals or power holding these back.


Prepare to be electrocuted.


----------



## EarthDog (May 10, 2019)

I'm sure, like Intel, it will be on one or two cores... it will make 4.6 single/dual-core without flinching. 5 GHz, I bet we'll see a SKU with that as well.


----------



## moproblems99 (May 10, 2019)

TheLostSwede said:


> Prepare to be electrocuted.



Static fine?


----------



## kings (May 10, 2019)

Unless an 8-core comes up that can get close to 5.0 GHz in OC, it will not be easy to get rid of my beloved 5820K @ 4.8 GHz.


----------



## Mamya3084 (May 10, 2019)

Is it possible that x570 adds quad channel memory support?


----------



## TheMadDutchDude (May 10, 2019)

No. It will be dual channel...


----------



## TheLostSwede (May 10, 2019)

Mamya3084 said:


> Is it possible that x570 adds quad channel memory support?



Why would it? The memory controller is in the CPU/SoC, not the chipset. Regardless, see the post above.



kings said:


> Unless an 8-core comes up that can get close to 5.0Ghz in OC, it will not be easy to get rid of my beloved 5820K@4.8Ghz.



I think you will be pleasantly surprised.


----------



## Mamya3084 (May 10, 2019)

I would have thought that some of the reserved pins could lead to additional quad-channel support on X570. Not necessarily via the chipset itself. Anyway, doesn't really matter.


----------



## Crackong (May 10, 2019)

kings said:


> Unless an 8-core comes up that can get close to 5.0Ghz in OC, it will not be easy to get rid of my beloved 5820K@4.8Ghz.



There is an existing product: the 9900K.
If you find the 9900K isn't enough to convince you, there must be something other than 8-core 5 GHz capability on your mind.


----------



## silapakorn (May 10, 2019)

Maybe this time I'll go back to red team after over a decade of being on blue team.


----------



## Caring1 (May 10, 2019)

The 3.3 GHz-4.3 GHz 16-core is basically confirmed; the rest is still bullshit.


----------



## TheLostSwede (May 10, 2019)

Caring1 said:


> 3.3GHz - 4.3GHz 16 core is basically confirmed, the rest is still bullshit.



Or not.


----------



## GorbazTheDragon (May 10, 2019)

Manu_PT said:


> GamerNexus talked about that on one of his recent videos and clearly stated it does not work like that. Having stuff open in background while gaming doesnt benefit from 50 cores. Ram yes. Unlees you are encoding or rendering while playing games.


I ran 16 GB of DDR3 1866 C9 on the 1150 system and 16 GB of 3200 16-17-17 on the 7700K, so that doesn't really match up.

The 4790K was noticeably better than the 5675C regardless, and the variables between those two are basically the same. Sure, the Broadwell chips are pretty rubbish clock-wise, but I really don't think you can chalk it up to the ~10% difference in single-core perf.


----------



## cucker tarlson (May 10, 2019)

TheLaughingMan said:


> No. It has been shown dozens of times that Intel is still king mainly because of clock speed and IPC. Whenever the clock for clock tests are done, Intel loses almost its entire gaming performance lead and drops to low single digit % leads. Latency is an issue, but not nearly the game performance killer you think.


No, it's still +10% in many cases, with the Ryzen 2600X having a 200 MHz advantage over the 8400:








*Jaki procesor wybrać? Test AMD Ryzen 5 2600 vs Intel Core i5-8400* ("Which processor to choose?"), www.purepc.pl, pages 36-40 and 44: a test of six-core processors (AMD Ryzen 5 1600X, Ryzen 5 2600, Ryzen 5 2600X, and Intel Core i5-8400), asking which of them is the fastest and the best value.

----------



## Gungar (May 10, 2019)

More fake news? Btarunr is on fire these days...



DeathtoGnomes said:


> Wonder if this still drop in existing motherboards without much hassle other than a BIOS update.



Well, I did that with my 9900K on the Z370 Maximus X Hero. Oh boy, was I wrong. I switched to the Z390 Aorus Master and it's night and day: more stable, and the voltage distribution is A LOT better and more efficient.


----------



## ratirt (May 10, 2019)

I think the architecture for Ryzen is set. AMD is focusing on higher frequency, lower latency and maybe some minor tweaks. They are pretty confident in what they are doing, and they've got a plan which I believe will more or less be fulfilled. AMD is really back, and you can see that in Intel's attempts to get its new node going, lowering prices, and scheduling new processor releases. It is on.


----------



## medi01 (May 10, 2019)

*How to undermine a product 101: overhype it, so people get disappointed later.*

16 cores is a given; 5 GHz, not at all.



eidairaman1 said:


> My FX 8350 does 5.0 on air, older node to boot...


That's getting high clocks the Prescott way.


----------



## Shatun_Bear (May 10, 2019)

The nth time that fake AdoredTV numbers have been referenced.

He made the numbers up, as they come from a chart in his video all the way back in December 2018! There is no way AMD knew the base/boost and price of every Ryzen 3K SKU at that date. Also, he listed one of the SKUs as having a _base_ clock merely 100 MHz short of the _boost_ clock of the Ryzen 2700X! I mean, if that is not a red flag for his info, I don't know what is.


----------



## TheLostSwede (May 10, 2019)

Gungar said:


> More fake news?


Nope.



Shatun_Bear said:


> The nth time that fake AdoredTV numbers have been referenced.
> 
> He made the numbers up, as they come from a chart in his video all the way back December 2018! There is no way AMD knew the base/boost and price of every Ryzen 3K SKU back at that date. Also, he listed one of the SKUs as having a _base _clock merely 100mhz short of the _boost_ clock of the Ryzen 2700x! I mean if that is not a red herring for his info, I don't know what is.



I hope you enjoy eating crow.


----------



## FYFI13 (May 10, 2019)

Shatun_Bear said:


> The nth time that fake AdoredTV numbers have been referenced.
> 
> He made the numbers up, as they come from a chart in his video all the way back December 2018! There is no way AMD knew the base/boost and price of every Ryzen 3K SKU back at that date. Also, he listed one of the SKUs as having a _base _clock merely 100mhz short of the _boost_ clock of the Ryzen 2700x! I mean if that is not a red herring for his info, I don't know what is.


ClownTV has been on my ignore list for a long time. Remember that octopus predicting World Cup results? That's exactly what ClownTV did: blind guessing.


----------



## ratirt (May 10, 2019)

Shatun_Bear said:


> The nth time that fake AdoredTV numbers have been referenced.
> 
> He made the numbers up, as they come from a chart in his video all the way back December 2018! There is no way AMD knew the base/boost and price of every Ryzen 3K SKU back at that date. Also, he listed one of the SKUs as having a _base _clock merely 100mhz short of the _boost_ clock of the Ryzen 2700x! I mean if that is not a red herring for his info, I don't know what is.


I don't agree with what you said, that he faked it all. He's got information from people who can have this information.
For the record: we are not supposed to know the price before the release, but AMD (like other companies) has its price points and goals set way before the release. It's called planning. Maybe this is something you are not familiar with in the company you work for or your own business, but I'm certain AMD has it all planned and calculated.
What I think about the rumors is that they may or may not be totally accurate, but there is some truth in them, and from our standpoint as customers, the options are either to wait for the release, or to look for facts and prove or disprove the rumor by talking about it and pointing out, with arguments, whether this is possible or not. I think AMD and Intel release these rumors themselves for a reason. These leaks make the other company compare them to its own products and mobilize for what's coming, even intimidate the opponent and make him move forward, or scare the crap out of him. Rumors are a good thing, giving you a glimpse of what's coming, and if you get involved in this "thinking and measuring facts with others" you may postpone your PC purchase/upgrade (if you need one) to see later whether it was worth the wait; if you're planning a new build/upgrade, you can start collecting some cash for it. Whether you believe the leaks or rumors is on you.
I think it might be true, but the truth will be revealed when Zen 2 is released.


----------



## TheLostSwede (May 10, 2019)

FYFI13 said:


> ClownTV is has been on my ignore list for long time. Remember this octopus predicting world cup results? That's exactly what ClownTV did - blind guessing.



Another serving of crow coming right up...


----------



## medi01 (May 10, 2019)

Shatun_Bear said:


> fake AdoredTV number


At this point it's rumored, not fake.



Shatun_Bear said:


> He made the numbers up


You can't possibly know that. When he speculates, he states that.
For this one, he mentioned a leak from an insider.
His "I/O die" info was spot on.


----------



## Shatun_Bear (May 10, 2019)

medi01 said:


> At this point it's rumored, not fake.
> 
> 
> You can't possibly know that. When he speculates, he stats that.
> ...



No, no, no. Please listen to what he actually said: he merely speculated about there being an I/O die on the chip, then said he didn't think they would do it. So again, it was a guess, and he concluded the other way.

Now, it's all well and good saying 'but it was just speculation' and 'some of it is subject to change', but that doesn't pass muster, frankly. The USP of that video is based on the very details that you guys say are subject to change. He listed every Ryzen 3000 SKU with exact base/boost clocks and prices, not even ranges. Now that he's being called out on his figures, it's too late to say 'but I may have been off'. He didn't mind profiting from the fervent excitement created by throwing around the 5 GHz/5 GHz+ numbers back in December.

He has profited no end from that hype-video leak extravaganza through more subscribers, Patreons and clicks on his videos. People are paying this guy money on the basis that he has insider info and knows what he is talking about. So ethically, there must be repercussions if he's led everyone down the garden path regarding info he posted online, which appears to be the case here. Just because he is a 'video leaker' and 'tech YouTuber' should not mean he can escape the ethics and standards that a tech journo would be held to.


----------



## TheLostSwede (May 10, 2019)

Shatun_Bear said:


> No, no, no. Please listen to what he actually said - he merely speculated about there being an I/O on the chip, then said he didn't think they would do it. So again, it was a guess where he concluded the other way.
> 
> Now, it's all well saying 'but it was just speculation'  and 'some of it is subject to change' but that doesn't pass muster, frankly. The USP in that video is based on these very details that you guys say are subject to change. He listed every Ryzen 3000 SKU with exact base/boost clocks and prices, not even ranges. Now he's being called out on his figures, it's too late now to say 'but I may have been off'. He didn't mind profiting from the fervent excitement created from throwing around the 5ghz/5Ghz+ numbers back in December.
> 
> He has profited from the above hype video leak extravaganza no end by more subscribers, Patreons and clicks on his videos. People are paying this guy money on the basis he has insider info and knows what he is talking about. So ethically, there must be repercussions if he's lead everyone down the garden path regarding info that he posted online, which appears the case here. Because he is a 'video leaker' and 'tech Youtuber', it should not mean he can escape ethics and standards that a tech journo would be held to.



So where's your proof that it won't happen? You're good at bashing something without even the least bit of insight or a shred of evidence to the contrary. So please, present some facts or hold your peace.

The tech journo is dead, and I should know, I used to be one, so there are no longer any standards to hold anyone to.


----------



## medi01 (May 10, 2019)

Shatun_Bear said:


> People are paying this guy money on the basis he has insider info and knows what he is talking about. So ethically, there must be repercussions if he's lead everyone down the garden path regarding info that he posted online, which appears the case here.


With that approach, not even Lisa Su could give you the right figures all the time, because plans change all the time.

I agree on repercussions, but not for being wrong at times; rather, for outright making up shit and claiming it's insider info (this is hard to prove, though).

He did come up with very unusual info that could not have come out of pure speculation, so he should have actual sources.


----------



## Chrispy_ (May 10, 2019)

Zareek said:


> Agreed but the leaks say the R9 3850X 16c/32t @ 4.3Ghz base 5.1Ghz boost will have the highest frequency.



I never quite understood why the largest chips are clocked highest. Is it because the more cores there are, the better the chance that one of them will reach the highest frequency, or is it just the dark silicon providing more surface area to dissipate heat?

If it's more cores increasing the chance of one of them clocking higher, does that not also apply at the low end, in that more cores increases the likelihood that one of them is a dog and can't clock very high at all?


----------



## Shatun_Bear (May 10, 2019)

medi01 said:


> With that approach not even Lisa Su could give you the right figures all the time, because plans change all the time.



Oh come on. Lisa Su wouldn't say 'Hey guys, I've got a good one for you today: all the base/boost clocks of every Ryzen 3000 CPU and their prices on top' back in December 2018 and ask money for this info, because it would be made up. And some guy sitting in front of his PC all day up in Scotland would certainly not be privy to this info.


----------



## kings (May 10, 2019)

Crackong said:


> There is an existing product : 9900k .
> If you find the 9900k's isn't enough to convince you, there must be something other than 8 cores 5GHz capability in your mind.



I know the 9900K does that, but it's a 550€ CPU in my country. I also don't like its thermals.

If AMD can offer similar or close performance for ~350€, I'll gladly buy one.

But we'll see; it's not that I desperately need it. I'm still quite happy with the 5820K, which has been with me since 2014.


----------



## Mysteoa (May 10, 2019)

Shatun_Bear said:


> Oh come on. Lisa Su wouldn't say 'Hey guys I got a good one for you today - all the base/boost clocks of every Ryzen 3000 CPU and their prices on top' back in December 2018 and ask money for this info, because it would be made up. Certainly this info would not be privy to some guy sitting in front of his PC all day up in Scotland.



What about the time when he leaked the RTX line-up naming, and that there would also be a GTX line-up? Many people didn't believe it, but it was true. About the I/O die, he got some info that disproved his I/O theory. If a part of a leak is not true, that doesn't make all of it false.
To me it looks like you only watch the tables and don't care what he actually says in those videos.


----------



## Super XP (May 10, 2019)

Mysteoa said:


> What about the time when he leaked the RTX line up naming and that there will also be GTX line up? Many people didn't believed it, but it was true. About the IO die, he got some info that was disapproving his IO theory. If a part of a leak is not true, it doesn't make all of it false.
> To me it looks like you only watch the tables and don't care what he actually says in those videos.


He's been more right than wrong. In the end, it's speculation and rumors, though he does seem to have some serious sources.


----------



## TheLostSwede (May 10, 2019)

Chrispy_ said:


> I never quite understood why the largest cores are clocked highest. Is it because the more cores there are, the more chance that one of them will reach the highest frequency, or is just the dark silicon providing more surface area to dissipate heat?
> 
> If it's the more cores increasing the chance of one of them being higher frequency, does that also not apply to the low end - in that more cores increases the likelihood that one of them is a dog and can't clock very high at all?



Below is a silicon wafer; this one happens to be from Intel. The best chips are always at the centre of the wafer; these are normally "perfect" and tend to be the ones that can run at the highest speeds. The further out toward the edge you go, the greater the chance of a defect. Even if the yield is, say, 80%, i.e. 20% of the chips on the wafer are not good enough to make a product out of, the remaining 80% will be of varying "quality". So in this case, in the middle you have the Core i7s, then the further out you go, you end up with i5s, i3s, Pentiums and Celerons. OK, this is a bit oversimplified, but it gives you a rough idea of how it works. It's possible that a very good wafer yields no Pentiums and Celerons, just a larger share of all the other parts. However, the CPU makers need to grade the CPUs in such a way that they get reasonable yields in each category, ideally as many of the highest category as possible, as they can charge the highest price for those chips.
Obviously this also means that, by this process of elimination, there's a bigger chance that an i7 will win the silicon lottery over an i5 and be that one chip in a thousand that can overclock like crazy. However, that doesn't mean Intel can sell it as a 6 GHz chip, as they'd only get a dozen of those a month, so it doesn't make financial sense to stretch too far either. But as these chips are the very best of the wafer, they also end up being the ones that are clocked the highest. Obviously, if there's a flaw in one of the centre parts, say the cache doesn't fully work, then that ends up as an i5, for example, and it might still be that one-in-a-thousand chip that can hit 6 GHz+, and the person who buys it got really lucky.
Does this make sense?
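The radial-binning idea above can be illustrated with a toy Monte Carlo sketch. Every number here (wafer size, defect rates, bin radii) is a made-up assumption for illustration, not actual foundry data; the point is only that a defect probability rising toward the wafer edge naturally produces fewer top-bin dies than mid-bin ones:

```python
import math
import random

random.seed(42)

WAFER_RADIUS = 150.0  # mm, a 300 mm wafer
DIE_SIZE = 15.0       # mm, die edge length (illustrative)

def defect_probability(r):
    """Chance a die is defective grows toward the wafer edge (toy model)."""
    return 0.05 + 0.45 * (r / WAFER_RADIUS) ** 2

def bin_die(x, y):
    """Classify one die by its distance from the wafer centre."""
    r = math.hypot(x, y)
    if r > WAFER_RADIUS:
        return "off-wafer"
    if random.random() < defect_probability(r):
        return "scrap"
    # Closer to the centre -> better silicon -> higher bin (toy thresholds).
    if r < 50:
        return "i7"
    elif r < 100:
        return "i5"
    return "i3"

# Walk a grid of die positions across the wafer and tally the bins.
counts = {}
positions = [i * DIE_SIZE - WAFER_RADIUS
             for i in range(int(2 * WAFER_RADIUS / DIE_SIZE) + 1)]
for x in positions:
    for y in positions:
        b = bin_die(x, y)
        counts[b] = counts.get(b, 0) + 1

for b in ("i7", "i5", "i3", "scrap"):
    print(b, counts.get(b, 0))
```

Because the outer rings cover far more area than the centre, the mid bins dominate even before defects are counted, which mirrors why the top-clocked SKU is always the rarest.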








Shatun_Bear said:


> Oh come on. Lisa Su wouldn't say 'Hey guys I got a good one for you today - all the base/boost clocks of every Ryzen 3000 CPU and their prices on top' back in December 2018 and ask money for this info, because it would be made up. Certainly this info would not be privy to some guy sitting in front of his PC all day up in Scotland.



I think you forget something: the plan was to launch Ryzen 3000 much earlier in the year, maybe not January, but February/March. The delay is largely down to the motherboards with the X570 chipset not being ready; in fact, they're still being tuned and will be tuned until the last minute.
So yes, I do believe AMD had everything figured out at that point in time already, but had to push things back. There might be more to it than the motherboards, but they were a huge contributing factor as to why the platform is only launching at Computex. I guess you don't really work in the industry, so you wouldn't know the first or the last thing about this, but instead you hang out on forums spreading FUD about something you know very little about.

As to why this info was sent to Jim, I don't know. Maybe someone likes him and thought he should be in know? Kind of like when you get a free donut at dunkin' for being a regular customer.


----------



## advanced3 (May 10, 2019)

TheLostSwede said:


> Or not.





TheLostSwede said:


> Nope.
> 
> 
> 
> I hope you enjoy eating crow.


 
I think you're just an AMD super fan, you're expectations are to high. There's no way those clock speeds will happen. I guess we'll have to wait and see.


----------



## TheLostSwede (May 10, 2019)

advanced3 said:


> I think you're just an AMD super fan, you're expectations are to high. There's no way those clock speeds will happen. I guess we'll have to wait and see.



Well, you're new here, so hello. You know nothing about me, my history, where I live, or what I do for a living, but you judge my insight into this topic; not very smart.

I know a lot of things I'm not at liberty to share here, but let's just say that once Ryzen 3000 launches, a lot of people are going to have to eat their own words. That said, I'm by no means saying everything AdoredTV puts in his videos will be exactly as he says, as I have no control over his content. Oh, and I expect a personal apology from you then, too.

Also, it's your, not you're (that's short for you are).


----------



## ratirt (May 10, 2019)

advanced3 said:


> I think you're just an AMD super fan, you're expectations are to high. There's no way those clock speeds will happen. I guess we'll have to wait and see.


And where do you get that statement from? "No way those clock speeds will happen"? Are those your own words, or are you quoting somebody?


----------



## yeeeeman (May 10, 2019)

Lol, so many people crying here. If we remember, 2 years ago the mainstream desktop CPU market topped out at a 4-core/8-thread CPU for ~$350. Now we get double that at ~the same price, and you still aren't satisfied. I have an i7-6700HQ notebook and it will be fine for at least 2-3 more years.


----------



## Metroid (May 10, 2019)

Shatun_Bear said:


> The nth time that fake AdoredTV numbers have been referenced.
> 
> He made the numbers up, as they come from a chart in his video all the way back December 2018! There is no way AMD knew the base/boost and price of every Ryzen 3K SKU back at that date. Also, he listed one of the SKUs as having a _base _clock merely 100mhz short of the _boost_ clock of the Ryzen 2700x! I mean if that is not a red herring for his info, I don't know what is.



Probably somebody from AMD, PR maybe, paid that guy to spread the info. He has gotten pretty much everything right up to this point. He spread it before everybody else, and if AMD wanted to contradict what he said, AMD could have at any time, and it did not; that tells us something. AMD needs the hype: the more people know about it, the more people will hold off buying Intel and wait for the new Ryzen. I'm one of them; the only thing I see in the i9-9900K is heat and nothing more. A waste of money; I feel sorry for whoever bought it. The i7-9700K is not bad, but another 14 nm chip is a no-go for me. I'm tired of old tech; mobile is at 7 nm too, desktop CPUs must keep up. Time for change, time to welcome Ryzen 3000. A month away, I reckon.


----------



## bug (May 10, 2019)

yeeeeman said:


> Lol, how many crying people here. If we remember 2 years ago, we had in the normal discrete desktop CPU market a 4 core 8 thread CPU max at ~350$. Now we have double that at ~ same price and you still aren't satisfied. I have an i7 6700HQ notebook and it is fine for at least 2-3 years more.


I don't see any crying.
I see people who dismiss this as a rumor, and people who take it for granted because it hasn't been disproved.

The claimed specs themselves are a tad above what you'd expect and a little counterintuitive in part. But that by itself doesn't make them true or false.


----------



## medi01 (May 10, 2019)

Shatun_Bear said:


> ...and ask money for this info...


Come on. 
His videos, on top of being free, typically sit at around 50k views; I doubt one could get a non-negligible income from that.


----------



## ratirt (May 10, 2019)

medi01 said:


> Come on.
> His videos, on top of being free, are at around 50k-ish views typically, I doubt one could get non-negligable income from that.


I agree. People dismiss the possibility that others might do things because it's their hobby, or because they simply love tech and want to share their insight and thoughts, not because of money. If you think everyone does things for money or profit, then you are hopeless and shallow. They also dismiss the fact that people and companies reach out to the guy for reviews. I get that it's way easier to crap on somebody and accuse him of being paid for something, instead of contributing constructive arguments that support your way of thinking and your accusation.


----------



## TheLostSwede (May 10, 2019)

bug said:


> I don't see any crying.
> I see people that dismiss this as a rumor and people that take this for granted because it hasn't been disproved.
> 
> The claimed specs themselves are a tad above what you'd expect and a little counterintuitive in part. But that by itself doesn't make them true or false.



You're right, it looked way too good to be true to me to start with as well. As with all rumours, ample salt is needed and when his video was first mentioned here, I said as much. But I also knew some of the facts back then and as such, I knew there was some truth to the info he mentioned. Now I know much more and he's really not far off the target. Again, a lot of people are going to have to eat their words.


----------



## kapone32 (May 10, 2019)

advanced3 said:


> I think you're just an AMD super fan, you're expectations are to high. There's no way those clock speeds will happen. I guess we'll have to wait and see.



You sound like an Intel fanboy with that statement.


----------



## EarthDog (May 10, 2019)

It's comical, the true fanboys don't get called out, yet, people that have at least a half a clue (in this case more, a full clue) are called fanboys. TPU members are AMAZING!


----------



## kapone32 (May 10, 2019)

I really don't understand why some people think this is not possible. I know it's a different architecture, but didn't the same thing happen with the FX-8150 (4.5 GHz max) vs the FX-8350 (over 5 GHz OC possible) in terms of clock-speed increase? Even the 1700 could not go past 4 GHz, but the 2700 goes to 4.3. Do you not think a node shrink of that size would bring those types of gains? The Vega 64 was 1247-1546 MHz and the Radeon VII is 1400-1750 MHz. A CPU has far fewer cores than a GPU, so does it not make sense that you would see a bigger boost on the CPU side?


----------



## bug (May 10, 2019)

EarthDog said:


> It's comical, the true fanboys don't get called out, yet, people that have at least a half a clue (in this case more, a full clue) are called fanboys. TPU members are AMAZING!


It's not specific to TPU; the rule of thumb on the Internet is that it's OK to be a fanboy as long as you root for the underdog, not the top dog. Do you get that, EarthDog? 



kapone32 said:


> I really don't understand why some people think that this is not possible. I know it is different architecture but did not the same thing happen with the FX8150 4.5GHZ max vs the FX8350  over 5 GHZ OC possible. in terms of clock speed increase. Even the 1700 could not go past 4 GHZ but the 2700 goes to 4.3? Do you not think a node shrink of that size would not bring those types of gains. The Vega 64 was 1247-1546 base and the Vega 7 is 1400 - 1750 base. A CPU has way less cores than a GPU so does it not make sense that you would see bigger boost on the CPU side.


While I won't say it's impossible, here are the things that make me take this with a grain of salt:
1. So many cores, clocked so high and only 135W TDP
2. Feeding that many cores from only two RAM channels.
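Concern #2 can be put into rough numbers. Peak DDR4 bandwidth is transfer rate times an 8-byte bus per channel; the naive even split across 16 cores below is a worst-case illustration, since real workloads rarely saturate all cores with memory traffic at once:

```python
def dual_channel_bandwidth_gbs(mt_per_s, channels=2, bus_bytes=8):
    """Peak DDR4 bandwidth in GB/s: transfers/s * 8-byte bus * channels."""
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

# Even split across 16 cores as a pessimistic bound.
for mts in (2666, 3200, 4000):
    bw = dual_channel_bandwidth_gbs(mts)
    print(f"DDR4-{mts}: {bw:.1f} GB/s total, {bw / 16:.1f} GB/s per core")
```

At DDR4-3200 that is 51.2 GB/s total, or about 3.2 GB/s per core if all 16 hammer memory simultaneously, which is tight for streaming workloads but ample for most desktop use.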


----------



## Patriot (May 10, 2019)

Hate to be the bubble-burster (not really), but clock speeds are going to start decreasing from here.
Stop expecting monster clock gains every gen; the clock wars are over. 7 nm EUV and 5 nm will probably have lower base clocks than 7 nm DUV.
You are just setting yourself up for disappointment.  

IPC can keep getting better with arch tweaks, and density will of course rise to a point; then die stacking will have to take over as physics wins.


----------



## R0H1T (May 10, 2019)

bug said:


> While I won't say it's impossible, here are the things that make me take this with a grain of salt:
> 1. So many cores, clocked so high and only 135W TDP
> 2. Feeding that many cores from only two RAM channels.


Same reason Intel can go from 4 cores to 8 within the same TDP.

IF2 will deliver over twice the bandwidth compared to the previous gen; pretty sure that counts for something.








AMD Ryzen 3000 "Zen 2" BIOS Analysis Reveals New Options for Overclocking & Tweaking: www.techpowerup.com
(AMD will launch its 3rd generation Ryzen 3000 Socket AM4 desktop processors in 2019, with a product unveiling expected mid-year, likely on the sidelines of Computex 2019. AMD is keeping its promise of making these chips backwards compatible with existing Socket AM4 motherboards.)



Feeding 16 cores isn't easy, but if you look at it closely, there aren't many consumer applications that are bandwidth-starved at the speeds Zen 2 will run its memory:

https://www.tomshardware.com/reviews/best-ram-speed-x470-pinnacle-ridge,6064-6.html


----------



## TheLostSwede (May 10, 2019)

bug said:


> While I won't say it's impossible, here are the things that make me take this with a grain of salt:
> 1. So many cores, clocked so high and only 135W TDP
> 2. Feeding that many cores from only two RAM channels.



No one said memory performance was up, did they? If so, I missed that.
As for the TDP, I guess we'll see, but a Threadripper 1950X is 180 W with just as many cores at 14 nm, so it doesn't seem impossible.
Salt is always good, just don't add too much, as that's not good for your health... Personally I prefer ammonium chloride, although only in my liquorice. 



Patriot said:


> Hate to be the bubble burster (not really) clock speeds are going to start to decrease from here.
> Stop expecting monster clock gains every gen, the clock wars are over.  7nm EUV and 5nm will probably have lower base than 7nm.
> You are just setting yourself up for disappointment.
> 
> IPC can keep getting better with arch tweaks and Density will of course rise to a point, then die stacking will have to take over as physics wins.



You're most likely right; this will be one of the final pushes in clock speeds for current technology. I doubt we'll see retail silicon-based chips running above 6 GHz with typical PC cooling. This is obviously why we're going wider instead, at least for the time being, but there's a limit to how wide you can go before you hit diminishing returns as well. At least until the software catches up and can make much better use of multiple threads.


----------



## The Lighthouse (May 10, 2019)

I have a dreadful feeling that AMD may have just abandoned Threadripper, not merely delayed it into 2020.

If there's any chance X570 has quad-channel memory, then it's safe to assume that AMD is no longer interested in Threadripper, instead focusing on Epyc and Ryzen.


----------



## M2B (May 10, 2019)

bug said:


> Even if this is totally legit, feeding 16c/32t needs at least a quad-channel RAM setup. It's not going to make its way into the mainstream.
> Me, I'm just curious what I can get in the $200-300 range to make me ditch my current CPU.



It's a matter of workload, and memory support will also be improved.
Assuming the X570 boards support memory speeds up to 4000 MHz, that's still plenty of bandwidth for plenty of workloads.
And even if you're memory-bandwidth bound in a specific application, it's not like it wouldn't perform any better than an 8-core, for example.


----------



## efikkan (May 10, 2019)

Darmok N Jalad said:


> I still don’t see why 5.0+ isn’t possible. Zen 1 was a first try at a new architecture, and Zen 1.5 wasn’t a massive departure as much as it was some tweaks and a new node. Zen 2 is AMD’s first real refinement, and it’s on 7nm to boot. It sure seems like current Zen offerings have an architectural limit preventing higher clocks. Just a wild guess, but that may be one of the things that the chiplet design solves, as the IO, IMC, and IF components may be what was so sensitive to faster clocks. Won’t be long now to find out. I hope AMD discusses it when Zen 2 arrives.


I'm not saying 5 GHz is impossible, but as of now we have no evidence it will happen, and it will be challenging to achieve on 7 nm DUV. Surprisingly, most people seem to think Zen 2 will hit 5 GHz with massive core counts, and it's all based on speculation falsely portrayed as "leaks".

Generally speaking, higher clocks require higher voltage, and voltage rises rapidly once you get outside the "sweet spot" of the node; every 100 MHz above that gets exponentially harder. For TSMC's 7 nm DUV to easily hit 5 GHz, it would need to be better than expected.

Also, chips on this node will effectively be ~2x denser, which means that unless power usage is more than cut in half, the heat per mm² will actually increase. The i9-9900K, which hits 5 GHz, is about 174 mm² (including the IGP), and it certainly runs into issues due to heat.

So just saying: manage your expectations.
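The density argument can be checked with back-of-the-envelope dynamic-power scaling (P is roughly proportional to C·V²·f). The node factors below are illustrative assumptions, not TSMC figures; the point is only the shape of the trade-off:

```python
def relative_power(cap_scale, voltage_scale, freq_scale):
    """Dynamic CPU power scales roughly as C * V^2 * f (all relative)."""
    return cap_scale * voltage_scale ** 2 * freq_scale

# Toy shrink: same design on a node with ~2x density (area halves),
# with assumed 0.7x switched capacitance and 0.85x voltage at iso-frequency.
area_scale = 0.5
power = relative_power(cap_scale=0.7, voltage_scale=0.85, freq_scale=1.0)
density = power / area_scale  # heat per mm^2 relative to the old node

print(f"relative power:  {power:.2f}")
print(f"relative W/mm^2: {density:.2f}")
# Total power drops, but because the area halved, heat per square
# millimetre still rises unless power is cut by more than half.
```

With these made-up factors, power lands around 0.5x but power density slightly above 1x, which is exactly the "heat per mm² will actually increase" point.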



M2B said:


> Assuming that the X570 boards may be supporting up to 4000MHz of speed, that's still plenty of bandwidth for plenty of workloads.
> And even if you're memory bandwidth bound in a specific application, it's not like it wouldn't perform any better than an 8 core, for example.


Last time I checked DDR4 supported up to 3200 MHz JEDEC speeds.


----------



## TheLostSwede (May 10, 2019)

The Lighthouse said:


> I have a dreaded feeling that AMD may have been just abandoned Threadripper, not just merely delaying it into 2020.
> 
> If any chance X570 have quad channels then it is safe to assume that AMD is no longer interested in Threadripper, instead focusing on Eypc and Ryzen.



It's expected later this year.

X570 will be dual channel. Well, technically the X570 doesn't have memory channels, as they're in the CPU/SoC, but the boards will only support dual channel configurations.



efikkan said:


> I'm not saying 5 GHz is impossible, but as of now we have no evidence supporting it will happen, and it will be challenging to achieve with 7nm DUV. Surprisingly most people seem to think Zen 2 will hit 5 GHz with massive core counts, and it's all based on speculation falsely portrayed as "leaks".
> 
> Generally speaking, higher clocks require higher voltage, and rises rapidly once you get outside of the "sweetspot" of the node, and every 100 MHz above that gets exponentially harder. For TSMC 7nm DUV to easily hit 5 GHz, it would need to be better than expected.
> 
> ...



You are aware that some people actually work in the industry and get this information ahead of product launches, right?
Just because you feel it's speculation doesn't mean that's the case.
If anything, your post is a bunch of speculation about something you have zero insight into.


----------



## Aerpoweron (May 10, 2019)

I think we will see soon enough what AMD will put out with the Ryzen 3000 series. It will be interesting to see how well the chiplet design works considering memory latency.

In my opinion it is more and more relevant what you need the processor for. Examine your use case, then buy the processor that fits it best.

But a disadvantage of a smaller production process is that the heat is generated in a smaller area, so getting the heat out of the hot zones will be quite challenging. There is even some research into boron nitride, which should have twice the heat-transfer capability of copper. But even if it works as promised, it will take years until it can be useful in chip production.

I have to agree that core clocks go down the more cores you have active, and it will be more challenging for programmers to utilize them properly. Nowadays, at least in Germany, not many programmers can do multicore programming.  And then you have the scheduler from the OS, which might throw threads around like crazy, slowing the threads down. From a heat perspective it might even be a good idea.  Can you imagine one core out of 16 running at 6 GHz and drawing all 135 W? Thermal stresses might ruin the CPU quickly.
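As a rough illustration of why one core at very high clocks eats so much power: dynamic power scales roughly as P ≈ C·V²·f, and voltage itself has to rise with frequency once you are past the node's sweet spot. A toy sketch in Python (all coefficients are made up for illustration, not real Zen 2 figures):

```python
# Simplified dynamic-power sketch: P ~ C * V^2 * f, with voltage assumed to
# rise linearly with frequency. The coefficients are illustrative only.
def dynamic_power_w(freq_ghz: float, base_v: float = 1.0, v_per_ghz: float = 0.1,
                    cap_factor: float = 20.0) -> float:
    """Hypothetical power draw (W) at a given clock, voltage rising with clock."""
    voltage = base_v + v_per_ghz * freq_ghz
    return cap_factor * voltage**2 * freq_ghz

for f in (3.0, 4.0, 5.0):
    print(f"{f:.1f} GHz -> ~{dynamic_power_w(f):.0f} W")
```

Because the voltage term is squared and itself grows with frequency, power climbs much faster than linearly, which is exactly why a few hot cores can blow through a TDP budget.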


----------



## TheLostSwede (May 10, 2019)

efikkan said:


> Last time I checked DDR4 supported up to 3200 MHz JEDEC speeds.



So what's this then, fake news? https://www.corsair.com/us/en/Categories/Products/Memory/vengeance-lpx-black/p/CMK16GX4M2K4700C19


----------



## bug (May 10, 2019)

M2B said:


> It's the matter of workload and also memory support will be improved.
> Assuming that the X570 boards may be supporting up to 4000MHz of speed, that's still plenty of bandwidth for plenty of workloads.
> And even if you're memory bandwidth bound in a specific application, it's not like it wouldn't perform any better than an 8 core, for example.


True, "shove many cores in there so no one will notice some are just waiting to be fed" is a viable strategy, too.
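For a rough sense of the bandwidth numbers involved: peak DDR4 bandwidth is just transfer rate × 8 bytes per 64-bit channel × channel count. A quick back-of-the-envelope sketch (theoretical peaks, not measured figures):

```python
# Back-of-the-envelope DDR4 peak bandwidth: MT/s x 8 bytes per 64-bit channel.
def ddr4_peak_bandwidth_gbps(mt_per_s: int, channels: int = 2) -> float:
    """Theoretical peak bandwidth in GB/s for a DDR4 configuration."""
    bus_width_bytes = 8  # one 64-bit channel transfers 8 bytes per beat
    return mt_per_s * bus_width_bytes * channels / 1000  # decimal GB/s

for speed in (2133, 2666, 2933, 3200, 4000):
    print(f"DDR4-{speed} dual channel: {ddr4_peak_bandwidth_gbps(speed):.1f} GB/s")
```

So dual-channel DDR4-3200 tops out around 51 GB/s in theory; whether 16 cores can be kept fed from that depends entirely on the workload.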


----------



## Aerpoweron (May 10, 2019)

Check out the SPD speed of the memory, which is DDR4 2133






				




The 4700MHz refers to XMP profiles, which is not a JEDEC specified speed.


----------



## efikkan (May 10, 2019)

TheLostSwede said:


> You are aware that some people actually work in the industry and get this information ahead of product launches, right?
> Just because you feel that it's speculation, doesn't mean that is the case.
> If anything your post is a bunch of speculation about something you have zero insight into.


Please be serious.
There is nothing wrong with speculating when it's labeled as such; what's wrong is portraying speculation as leaked facts, which many opinionators on YouTube and sites like Wccftech, Videocardz, etc. do.

Final clocks and prices are not set until the final stepping has gone through qualification, which usually happens weeks before launch. Anyone who claims to know these details >6 months ahead is lying, regardless of who they claim to be. You can speculate all you want, but you can never know a fact before it exists.



TheLostSwede said:


> So what's this then, fake news? https://www.corsair.com/us/en/Categories/Products/Memory/vengeance-lpx-black/p/CMK16GX4M2K4700C19


Ever heard of XMP? That's the overclocked speed. I'm sorry, but I thought this was common knowledge.


----------



## TheLostSwede (May 10, 2019)

Aerpoweron said:


> Check out the SPD speed of the memory, which is DDR4 2133
> 
> 
> 
> ...



And this is relevant how? Does this mean that the memory can't run faster than 2133MHz?



efikkan said:


> Please be serious.
> There is nothing wrong with speculating when it's labeled as such; what's wrong is portraying speculation as leaked facts, which many opinionators on YouTube and sites like Wccftech, Videocardz, etc. do.
> 
> Final clocks and prices are not set until the final stepping has gone through qualification, which usually happens weeks before launch. Anyone who claims to know these details >6 months ahead is lying, regardless of who they claim to be. You can speculate all you want, but you can never know a fact before it exists.
> ...



I am being serious. I also know things you don't know, but clearly everyone is wrong but you.
You're so stubborn you can't even listen to other people or try to read between the lines.
Let me spell it out for you: YOU ARE WRONG.


----------



## bug (May 10, 2019)

TheLostSwede said:


> I am being serious. I also know things you don't know, but clearly everyone is wrong but you.
> You're so stubborn you can't even listen to other people or try to read between the lines.
> Let me spell it out for you: YOU ARE WRONG.


Look, all he's saying is while you may have additional information, the rest of us don't. We only know that some guy posted something, somewhere. He's been right before and so has the horoscope. What should we make of this?


----------



## TheLostSwede (May 10, 2019)

bug said:


> Look, all he's saying is while you may have additional information, the rest of us don't. We only know that some guy posted something, somewhere.



Exactly. But who then gives you the right to go and call that person a liar and a cheat? Feel free to take it with a boatload of salt, say that you don't believe it's real, but you can't call someone a liar if you don't have the facts to prove them wrong. This is part of what's wrong with the internet.
I'm sorry I can't share the information I have, but I've been trying to tell all the naysayers in this thread that he's not way off; in fact, the details he released in December are very close to what will launch. I don't understand why that is so impossible to comprehend.


----------



## storm-chaser (May 10, 2019)

I'm sure AMD has a couple tricks up its sleeve. 

I for one would be excited to see another 5.0 GHz part from AMD.

They've done it before, so it's not without precedent.


----------



## Aerpoweron (May 10, 2019)

TheLostSwede said:


> And this is relevant how? Does this mean that the memory can't run faster than 2133MHz?



The 2133 MHz is the JEDEC specification; the memory is guaranteed to run at this speed if the CPU officially supports it.
The XMP settings are overclocking settings. They will most likely work, but they are not guaranteed.

I have an i7 4790K which can handle one DDR3 2400 MHz kit, but not another. Strange crashes, even corrupt data, appear. If you want to use such high-speed memory in a production environment, you have to do extensive testing.

What I find a little strange about the Corsair memory is that it doesn't support the JEDEC 2666 MHz specification, which I would assume should be fine for a kit that overclocks to 4700 MHz.

The highest JEDEC-specified DDR4 speed I have seen for consumer computers is 2666 MHz. For servers, JEDEC-specified 2933 MHz is already available.
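To make the distinction concrete: JEDEC standardizes a fixed set of DDR4 data rates, and anything above them is a vendor XMP profile. A tiny sketch (the speed list reflects the JEDEC DDR4 grades; treat it as illustrative):

```python
# JEDEC-standardized DDR4 data rates (MT/s). Anything not on this list is
# sold as a vendor XMP/overclock profile, not a JEDEC speed grade.
JEDEC_DDR4_SPEEDS = (1600, 1866, 2133, 2400, 2666, 2933, 3200)

def classify(speed_mts: int) -> str:
    """Label a module's rated speed as JEDEC-standard or XMP/overclock."""
    return "JEDEC" if speed_mts in JEDEC_DDR4_SPEEDS else "XMP/overclock"

print(classify(2133))  # the kit's SPD fallback speed
print(classify(4700))  # the Corsair kit's rated XMP speed
```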


----------



## bug (May 10, 2019)

TheLostSwede said:


> Exactly. But who then gives you the right to go and call that person a liar and a cheat?


Was the internet invented so we can call people names?


TheLostSwede said:


> I'm sorry I can't share the information I have, but I've been trying to tell all the naysayers in this thread that he's not way off, in fact the details he released in December are very close to what will launch. I don't understand why that is so impossible to comprehend.


You have to realize that, to us, you're still just a guy posting on the internet. Right as you may be, we can't verify your claims any more than we can verify the supposed leak. I, for one, believe you. But that's my personal choice; don't be surprised when others don't.



Aerpoweron said:


> The 2133 MHz is the JEDEC specification; the memory is guaranteed to run at this speed if the CPU officially supports it.
> The XMP settings are overclocking settings. They will most likely work, but they are not guaranteed.
> 
> I have an i7 4790K which can handle one DDR3 2400 MHz kit, but not another. Strange crashes, even corrupt data, appear. If you want to use such high-speed memory in a production environment, you have to do extensive testing.
> ...


Oh, XMP will work, that much is guaranteed. You just need to pair the sticks with a mobo that can also run them at those speeds.


----------



## TheLostSwede (May 10, 2019)

Aerpoweron said:


> The 2133 MHz is the JEDEC specification; the memory is guaranteed to run at this speed if the CPU officially supports it.
> The XMP settings are overclocking settings. They will most likely work, but they are not guaranteed.
> 
> I have an i7 4790K which can handle one DDR3 2400 MHz kit, but not another. Strange crashes, even corrupt data, appear. If you want to use such high-speed memory in a production environment, you have to do extensive testing.
> ...



Again, what does this have to do with anything being discussed here?
We all know AMD has had memory controller issues, and I don't know if they've resolved them or not. 
What I do know is that we can expect support for faster memory, and it's outside of JEDEC spec.

That was just a random stick of RAM I pulled up to prove my point: memory speed isn't limited to JEDEC spec and hasn't been for years. 
Shit, I remember testing the first PC133 memory on a VIA board and it didn't work for shit. It wasn't JEDEC spec at the time.


----------



## M2B (May 10, 2019)

efikkan said:


> Ever heard of XMP? That's the overclocked speed. I'm sorry, but I thought this was common knowledge.



Who the hell cares about the JEDEC standards? 
Overclocking the memory is as easy as drinking water.


----------



## TheLostSwede (May 10, 2019)

bug said:


> Was the internet invented so we can call people names?
> 
> You have to realize that, to us, you're still just a guy posting on the internet. Right as you may be, we can't verify your claims anymore than we can verify the supposed leak. I, for one, believe you. But that's my personal choice, don't be surprised when others don't.
> 
> ...



Some days it feels like it.

Oh, I have, trust me. Look at when I joined here though. I've known the owner of this site since before it started. In fact, I told him I thought the name was a bit pants...

Isn't that kind of the point of leaks, that they can't really be verified? Again, I'm not surprised, but I don't get the continuous counter-arguments when someone is trying to let you know that they know something you don't and maybe you should just sit this one out.
I'm trying my best to share some details here without getting people in trouble. At least I know what I will be spending my money on later this year.


----------



## Deleted member 158293 (May 10, 2019)

Dizzying back&forth here... 

Just a guess, but if even the next Ryzen generation is a chiplet design, then brand differentiation with Threadripper comes down to just PCIe lanes and memory channels, not the CPU design itself.

I can see AMD repositioning their successful "Threadripper" branding differently for maximum impact, which might cause it to disappear, at least for a while until it's needed again.


----------



## Aerpoweron (May 10, 2019)

TheLostSwede said:


> Again, what does have to do with anything being discussed here?
> We all know AMD has had memory controller issues and I don't know of they've resolved them or not.
> What I do know is that we can expect support for faster memory and it's outside of JEDEC spec.
> 
> ...



My apologies, misunderstanding on my side.

Back to the discussion then. Are there officially supported memory speeds for the Ryzen 3000 series? As far as I have seen, AMD was always a little ahead of Intel concerning officially supported memory speeds. My guess would be DDR4-3200 support, since the Ryzen 2000 series has 2933 support.


----------



## efikkan (May 10, 2019)

TheLostSwede said:


> Exactly. But who then gives you the right to go and call that person a liar and a cheat? Feel free to take it with a boatload of salt, say that you don't believe it's real, but you can't call someone a liar if you don't have the facts to prove them wrong. This is part of what's wrong with the internet.
> 
> I'm sorry I can't share the information I have, but I've been trying to tell all the naysayers in this thread that he's not way off, in fact the details he released in December are very close to what will launch. I don't understand why that is so impossible to comprehend.


I'm going to give you a last chance to read my posts and apologize for your adolescent behavior. Even if I were wrong, there is no excuse for behaving like this, and any grown-up should be able to make a serious argument for their case.

We're not going to descend into a philosophical discussion about what _truth_ really is here, but any reasonable person understands the distinction between speculation and fact. No-one can know the outcome of a soccer match before it's played; they can make a qualified guess, and even be right, but it still doesn't change the fact that the result was not known in advance. 

Please keep this discussion on topic from now on.

Until the final specs are actually set, AMD, Nvidia and Intel operate with targets, and these are ranges, not specific clock speeds and exact prices, since this depends on the final yields. These are set after the final stepping has gone through qualification. Many leaks can contain true information, like bigger architectural changes etc., but no-one can know the final clocks/prices etc. until they are set, and you can use this knowledge to discredit many false "leaks". False leaks only accelerate disappointment when they eventually come crashing down.



bug said:


> Oh, XMP will work, that much is guaranteed. You just need to pair the sticks with a mobo that can also run them at those speeds.


The sticks are guaranteed, but not the motherboard or the memory controller. Many Skylake/Kaby Lake CPUs struggle to remain stable above 3000 MHz over time.


----------



## TheLostSwede (May 10, 2019)

Aerpoweron said:


> My apologies, misunderstanding on my side.
> 
> Back to the discussion then. Are there official memory supported speeds for the Ryzen 3000 series? As far as i have seen AMD was always a little ahead of Intel concerning officially supported memory speeds. My guess would be DDR4 3200 support since the Ryzen 2000s have 2933 support.



I'm sure there will be; I just don't know that number yet. However, it will work with much faster memory than that this time around.
Apparently AMD is kind of crap when it comes to memory validation; they mostly leave that up to the motherboard makers. 
So there's at least one detail that those of you considering Ryzen 3000 should keep in mind: get memory that's on the QVL, as with current boards, since those modules should work best. Anything else is down to luck.


----------



## bug (May 10, 2019)

TheLostSwede said:


> Some days it feels like it.
> 
> Oh, I have, trust me. Look at when I joined here though. I've known the owner of this site since before it started. In fact, I told him I thought the name was a bit pants...
> 
> ...


So you're saying I could get something noticeably better than my 6600k for $300 or less?


----------



## TheLostSwede (May 10, 2019)

efikkan said:


> I'm going to give you a last chance to read my posts and apologize for your adolescent behavior. Even if I were wrong, there is no excuse for behaving like this, and any grown-up should be able to make a serious argument for their case.
> 
> We're not going to descend into a philosophical discussion about what _truth_ really is here, but any reasonable person understands the distinction between speculation and fact. No-one can know the outcome of a soccer match before it's played; they can make a qualified guess, and even be right, but it still doesn't change the fact that the result was not known in advance.
> 
> ...



I should apologise to you? 
You're the one going around doing the name calling. You're the one making a fool of yourself. You've already decided on your point of view and you have no intention of changing that. So what happens when the products launch and it turns out that you're wrong? Are you going to apologise to everyone you've called a liar then?

You just keep going back to the same bad logic that you have decided in your head is the only way things can be.
Once again, Ryzen 3000 was supposed to launch sometime after CES, say February/March, but was delayed due to the X570 chipset and motherboards not being ready for prime time. So obviously AMD would have already worked out all the CPU details by December and was ready to make an initial announcement at CES and tell their customers about pricing, etc. at the show. This seems to be an impossible concept for you to grasp, why?

Until you can prove that his information is false, you can't say that. I'm trying to tell you that he's within spitting distance. No more than 5-10% here or there. Obviously you don't care about someone trying to confirm things either, as who am I? How can I possibly know anything about this, I'm just some dude on a forum.



bug said:


> So you're saying I could get something noticeably better than my 6600k for $300 or less?



Depends on how many cores you want...


----------



## Aerpoweron (May 10, 2019)

bug said:


> So you're saying I could get something noticeably better than my 6600k for $300 or less?



Define what "better" means for you. 

For pure gaming I don't think you will notice anything, especially with a 60 Hz monitor. But if you stream and play, I guess there will be a reasonably priced 6 or 8 core Ryzen 3000 for you.


----------



## eidairaman1 (May 10, 2019)

medi01 said:


> *How to undermine a product 101: overhype it, to get disappointed later.*
> 
> 16 cores is a given, 5Ghz not at all.
> 
> ...



43 idle, 55 gaming, worst case power virus 75



Aerpoweron said:


> Define what "better" means for you.
> 
> For pure gaming I don't think you will notice anything, especially with a 60 Hz monitor. But if you stream and play, I guess there will be a reasonably priced 6 or 8 core Ryzen 3000 for you



Considering Ryzen 1000 and 2000 did it just fine


----------



## medi01 (May 10, 2019)

eidairaman1 said:


> 43 idle, 55 gaming, worst case power virus 75


That's not the point (and it tells very little about actual performance). It was built for high clocks (aiming at even higher clocks) *to make up for IPC lower than previous gen*.


----------



## storm-chaser (May 10, 2019)

efikkan said:


> Until the final specs are actually set, AMD, Nvidia and Intel operate with targets, and these are ranges, not specific clock speeds and exact prices, since this depends on the final yields. These are set after the final stepping has gone through qualification. Many leaks can contain true information, like bigger architectural changes etc., but no-one can know the final clocks/prices etc. until they are set, and you can use this knowledge to discredit many false "leaks". False leaks only accelerate disappointment when they eventually come crashing down.


Are you trying to make the claim that AMD doesn't even know what the final clocks will be? Like AMD is there rolling the dice crossing fingers and just hoping for 5.0Ghz... praying for 5.0Ghz... And I'm sure precision boost operates in much the same way? Luck of the draw with qualification, "final stepping" and all? Since AMD is operating in "ranges" with no set goals, lol. AMD might as well open casinos instead of chip fab plants with this kind of logic.


----------



## Shatun_Bear (May 10, 2019)

medi01 said:


> Come on.
> His videos, on top of being free, are at around 50k-ish views typically, I doubt one could get non-negligable income from that.



I meant through Patreon, which is what 'supports' him to make the videos. He gets $2300 a month, which is a decent wage in itself. So whatever you say, he has to be held to some standards as people are paying for this stuff.


----------



## eidairaman1 (May 10, 2019)

Time to wait till the parts arrive.


----------



## Shatun_Bear (May 10, 2019)

storm-chaser said:


> Are you trying to make the claim that AMD doesn't even know what the final clocks will be? Like AMD is there rolling the dice crossing fingers and just hoping for 5.0Ghz... praying for 5.0Ghz... And I'm sure precision boost operates in much the same way? Luck of the draw with qualification, "final stepping" and all? Since AMD is operating in "ranges" with no set goals, lol. AMD might as well open casinos instead of chip fab plants with this kind of logic.



It's a new process node, their first CPU on 7 nm; of course they absolutely do not know what the final clocks will be until relatively close to launch. They will have targets and ranges of course, but like the guy above stated, that's not what the Adored charlatan claimed in his vids, hence the hype and excitement he created. I mean, he could have claimed 'AMD are targeting 4.5-4.8 GHz boost' back in December last year, which would have been believable, instead of exact clocks and prices 8 months before release.


----------



## storm-chaser (May 10, 2019)

Sure, I get the argument. It's like GM producing the Corvette ZR1: until it's track tested, the top speed remains unknown, unless they pre-program a limiter into the ECU.

Sure, binning takes place near the end of production, but targets are there long in advance.


----------



## Jism (May 10, 2019)

storm-chaser said:


> Are you trying to make the claim that AMD doesn't even know what the final clocks will be? Like AMD is there rolling the dice crossing fingers and just hoping for 5.0Ghz... praying for 5.0Ghz... And I'm sure precision boost operates in much the same way? Luck of the draw with qualification, "final stepping" and all? Since AMD is operating in "ranges" with no set goals, lol. AMD might as well open casinos instead of chip fab plants with this kind of logic.



Before they actually put the chip design onto a wafer, they do have certain expectations, and TSMC probably will show them what ranges to expect. It's not like they gamble like this before ramping up a design to be turned into a working CPU.

The reason the 2700X and such never passed 4.4 GHz (if you were lucky) is purely due to the silicon's limitations. They opted for a power-efficient chip, and that's what they got. They knew with the 2nd revision what to aim for and how to get the clocks up. Add some sauce of efficiency on top and you can pack a lot more cores than the first generation without exceeding the power envelope.

This looks very promising, as my 2700X has not even been in my system for 3 months or so, and they're already releasing a big tank with 16 cores and 32 threads. Right now I'm feeling very saturated


----------



## EarthDog (May 10, 2019)

I think many people would be surprised at how much an initial goal changes once the chips are cut and yields are figured out, with all the parameters for each SKU in place. While it isn't night and day, it's still a bit premature, unless you are in the know, to guess at it. That said, by now, surely they have an idea of where the plinko chip falls... most of us, however, have no idea where exactly that will be.


----------



## storm-chaser (May 10, 2019)

In theory, as the manufacturing process improves with every nm jump, we should see less and less need for binning as time goes forward. Is that correct?

Put it another way: will we get to a point where the manufacturing process is so good that binning becomes a thing of the past?


----------



## efikkan (May 10, 2019)

storm-chaser said:


> In theory, as the manufacturing process improves with every nm jump, we should see less and less need for binning as time goes forward. Is that correct?
> 
> Put it another way: will we get to a point where the manufacturing process is so good that binning becomes a thing of the past?


No, quite the contrary. With smaller and more advanced nodes, leakage and small defects become a larger problem.
Binning is very much needed, as the quality of the chips is not consistent.

Also, each node jump has changes in materials, sizes of gates, wires, etc., which requires more tweaking and calibration on newer nodes.
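The idea can be illustrated with a toy binning model: dies vary in the highest clock they can sustain and get sorted into SKU bins accordingly. All thresholds and the quality distribution below are made up for illustration; real binning uses many electrical tests, not one number:

```python
import random
from collections import Counter

# Toy model of speed binning: each die gets a simulated "max stable clock"
# and is sorted into a SKU bin. Thresholds are hypothetical.
def bin_die(max_stable_ghz: float) -> str:
    if max_stable_ghz >= 4.7:
        return "top SKU"
    if max_stable_ghz >= 4.4:
        return "mid SKU"
    if max_stable_ghz >= 4.0:
        return "entry SKU"
    return "reject/salvage"

random.seed(42)
dies = [random.gauss(4.4, 0.3) for _ in range(10_000)]  # simulated process variance
print(Counter(bin_die(d) for d in dies))
```

The spread never goes away; a better process shifts the whole distribution up, but the vendor still sells the tail ends as different SKUs.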


----------



## Darmok N Jalad (May 11, 2019)

efikkan said:


> So just saying; manage your expectations.


It’s not about my expectations. AMD needs 5.0 GHz+ or much better IPC, or they won’t be keeping up with Intel. I still think it’s totally possible and that first-generation Zen is limited architecturally to its current clocks. I think the latest snippet we’ve seen about Zen 2 managing high-speed memory differently might be a clue (if true). It looks like IF has been holding back faster DRAM, so it’s quite possible IF was failing the CCX when it was clocked higher. There’s a reason they went to chiplets, and I think it was to solve clock scaling. What AMD actually does doesn’t really matter that much to me; I’m just having fun speculating about the engineering.


----------



## TheMadDutchDude (May 11, 2019)

I am getting more and more itchy to grab a new CPU and board...

I have some A2 B-die here that should be good for 4500+ MHz with relative ease, so I'll be looking to see what it can do.


----------



## Frick (May 11, 2019)

EarthDog said:


> It's comical, the true fanboys don't get called out, yet, people that have at least a half a clue (in this case more, a full clue) are called fanboys. TPU members are AMAZING!



It just shows TPU is mostly regular and, to be honest, pretty low-level computer dudes (with a bunch of exceptions of course), and because they're average dudes from the streets, fact and opinion are interchangeable and "knowledge" is an almost ethereal term that is distrusted. It's not a TPU problem. "I reject your reality and substitute my own" is a fun catchphrase, but I doubt it was ever meant to be a moral philosophy you actually should adhere to.

And technically everything about Zen 2 is rumor, up to the point of reviews hitting the homes of average computer dudes, the unwashed streets of Soho (or wherever gangs of computer dudes roam).



Anywhoo. Me, I'm mostly interested in the sub-€200 market. I've started the process of ripping my DVDs, and my little Haswell i3 is definitely showing its age. High-end parts are fun and all, but... eh.


----------



## Shatun_Bear (May 11, 2019)

TheLostSwede said:


> Once again, Ryzen 3000 was supposed to launch sometime after CES, say February/March, but was delayed due to the X570 chipset and motherboards not being ready for prime time. So obviously AMD would have already worked out all the CPU details by December and was ready to make an initial announcement at CES and tell their customers about pricing, etc. at the show. This seems to be an impossible concept for you to grasp, why?



Hold on, so are you claiming that Ryzen 3000 CPUs were ready to hit retail in February (or March), the month when Lisa Su demonstrated an engineering sample at CES and claimed they are still working on increasing performance? Because this sounds like total BS as well.

I mean, it is very simple. If word from the horse's mouth is 'we're still in the engineering sample stage', then that is a timeline you'd expect for a launch several months later. Yet here we have a random poster on the internet claiming chips were actually ready to hit retail whilst Su was standing on stage indicating the opposite?! Because you claim the launch was only delayed by the mobos.

Please tell me I have read your claims wrong.


----------



## TheLostSwede (May 11, 2019)

Shatun_Bear said:


> Hold on, so are you claiming that Ryzen 3000 CPUs were ready to hit retail in February (or March), the month when Lisa Su demonstrated an engineering sample at CES and claimed they are still working on increasing performance? Because this sounds like total BS as well.
> 
> I mean, it is very simple. If word from the horse's mouth is 'we're still in the engineering sample stage', then that is a timeline you'd expect for a launch several months later. Yet here we have a random poster on the internet claiming chips were actually ready to hit retail whilst Su was standing on stage indicating the opposite?! Because you claim the launch was only delayed by the mobos.
> 
> Please tell me I have read your claims wrong.



No, that's not what I'm claiming. What I'm saying is that that was the original plan. However, something clearly went pear shaped and at least one of the issues was the lack of chipset. AMD actually considered launching the CPUs without a new chipset.

It's indeed very simple: if in December they'd set out the planned SKUs and pricing, but had a change of heart, they could easily change the story at CES. It's not hard to go up on stage and say, hey, here's an engineering sample, rather than saying, hey, here's our new chip which will launch in a couple of months.

I didn't claim it was only the motherboards; that's the part of the story I know. If there's more to it, then it's information I don't have.

You really like to twist and turn things, no? But whatever, you're not in the industry, so you don't know what's been going on, nor what's going on now.



Frick said:


> And technically everything about Zen 2 is rumor, up to the point of reviews hitting the homes of average computer dudes, the unwashed streets of Soho (or wherever gangs of computer dudes roam).



It used to be Tottenham Court Road; Soho was a bit too scary for the computer dudes, but actually just a stone's throw away...
I used to work in that neighbourhood, twice in fact. Sadly there isn't much left of interest, as most of the electronics and computer shops have closed. I didn't work in a shop though.


----------



## Shatun_Bear (May 11, 2019)

TheLostSwede said:


> No, that's not what I'm claiming. What I'm saying is that that was the original plan. However, something clearly went pear shaped and at least one of the issues was the lack of chipset. AMD actually considered launching the CPUs without a new chipset.
> 
> It's indeed very simple: if in December they'd set out the planned SKUs and pricing, but had a change of heart, they could easily change the story at CES. It's not hard to go up on stage and say, hey, here's an engineering sample, rather than saying, hey, here's our new chip which will launch in a couple of months.
> 
> ...



So now you're changing your story? I think this is all a big wind up to you.

You first claimed Ryzen 3000 was intended to launch in Feb/March but AMD held back because of mobo chipset delay. Then it was pointed out to you Lisa Su could only show an early engineering sample in Feb, your window for the intended launch. So any launch then, intended or otherwise, was impossible.

So now you're saying actually something went 'pear shaped' with the CPUs themselves to scupper these plans.

So in this world of fiction of yours, AMD had set out every Ryzen CPU and APU in December, with all the base/boost clocks and prices for the complete line-up, which explains your undying belief in AdoredTV's fake chart, as these CPUs were ready to release in Feb/March. But sometime between Dec and Feb these plans all went pear shaped, so Lisa Su was asked to stand up on stage at CES and pretend that the CPUs are all still in the eng. sample stage and will now launch 'summer 2019' instead.

I mean, just read that back to yourself and ponder whether it sounds plausible, considering there were few or no Zen 2 engineering samples before Dec 2018. If launch was planned for Feb/March, we would have seen several engineering samples in September, October or earlier.


----------



## TheLostSwede (May 11, 2019)

Shatun_Bear said:


> So now you're changing your story? I think this is all a big wind up to you.
> 
> You first claimed Ryzen 3000 was intended to launch in Feb/March but AMD held back because of mobo chipset delay. Then it was pointed out to you Lisa Su could only show an early engineering sample in Feb, your window for the intended launch. So any launch then, intended or otherwise, was impossible.
> 
> ...



Huh, I didn't change anything, you're clearly reading things whichever way you want.

Lisa Su showed the sample in early January; CES is usually in the first or second week of January, not February. So who's making crap up now?
Again, read what I wrote: it's easy to call anything an engineering sample if you have never shown it to anyone outside the company before, no?

I never said something went pear-shaped with the CPUs; again, that's your interpretation. I said "something clearly went pear shaped and at least one of the issues was the lack of chipset", which, if you read it carefully, implies that there might have been other issues beyond the lack of the chipset, but I don't know whether that is/was the case or not.

I never mentioned the APUs; I have no knowledge about them. Please stop trying to add things into the discussion that I have not talked about.
I also never said I have undying belief in AdoredTV's charts. I had information about Ryzen 3000 before he posted his chart. I can't share that information here though, since that might cause issues for people I'm friends with and I don't do that to friends. I have a file on my desktop from the 4th of October 2018 with full chipset specs for X570. I can share that once we pass Computex if you'd like.
I have no reason to rely on AdoredTV; I work in the tech industry and have done for most of my life, and I have my own connections and sources.

I don't know what you do for a living, but tell me something: plans never change in your job? I don't know how many times plans have changed in the various jobs I've had since I started working. Products are delayed last minute, product specs change last minute, etc. Shit, I worked for a company that launched a product on Kickstarter; two months before the planned shipping date, with a finished product, we scrapped it as it was crap, started over from scratch and delayed everything six months, although it ended up being a year in the end due to various reasons. So yes, I do believe AMD did exactly this. Something changed in their plans and they changed the presentation at CES accordingly.

Considering how late the motherboard makers got CPU samples for Ryzen 1000, yes, it's very plausible. In fact, they only got their final ES samples this past week. AMD did a tour of the board makers in Taiwan this week, handed out samples and did some testing with them. Can I prove this to you? No. But I know it happened; you can believe it or not. I'm guessing part of the reason is that they didn't want too many details to leak before the launch. Have the board makers had CPUs before that? Yes, they've had chips since last year, just not at the correct clock speeds. It's a great way to prevent leaks. Taiwan and China are leaky sieves when it comes to these things, and let's assume the AdoredTV charts are right and AMD has a corker of a product: would they want that to leak ahead of the launch? Most likely not, as they want it to be a big surprise. I know for a fact that many of the rumours are true, but go on, feel free to doubt it; I do hope you'll come back here and apologise in public once the products launch, if it turns out you were wrong.


----------



## efikkan (May 11, 2019)

Whenever there is a launch of a new platform like this, there are months of internal testing (1), then early engineering samples go to third parties, mostly motherboard makers, BIOS developers, etc. (2), and then, after the (assumed) final stepping is ready, a large-scale testing effort by many partners (3), including OEMs, game developers, important enterprise customers (if applicable), etc. These samples are usually identical to the final product in silicon, and clocks are usually close or identical.

I don't recall what the early roadmaps said about the launch of Zen 2, but if in fact it was delayed it happened way before December. By the signs the public have seen of the lacking ramp-up of Zen 2, it couldn't have launched in February, even if the node was ready. The ramp-up seems to have started around February and we are now in (3), which should indicate we are soon approaching the launch window.

I think the rumors about a launch at CES were a mix of wishful thinking and too much reading between the lines. It was pretty clear from what Lisa said that they were far away from launch, and the reason she didn't want to commit to a date is that she knew there was still some uncertainty.

And BTW; chipset specs shouldn't change after tapeout, which happens ~1 year ahead of launch.


----------



## TheLostSwede (May 11, 2019)

efikkan said:


> Whenever there is a launch of a new platform like this, there are months of internal testing(1), then early engineering samples to third parties (mostly motherboard makers, BIOS developers, etc.)(2), and then after the (assumed) final stepping is ready, a large scale testing effort from many partners(3), including OEMs, game developers, important enterprise customers(if applicable) etc. These are usually identical with the final product in silicon, clocks are usually close or identical.
> 
> I don't recall what the early roadmaps said about the launch of Zen 2, but if in fact it was delayed it happened way before December. By the signs the public have seen of the lacking ramp-up of Zen 2, it couldn't have launched in February, even if the node was ready. The ramp-up seems to have started around February and we are now in (3), which should indicate we are soon approaching the launch window.
> 
> ...



Let me be like you two, show me proof. Can you verify any of the stuff you just wrote? Links? Sources?

You two are speculating a lot, which you're free to do, but at the same time, you then need to accept that others have the same right.

I know for a fact that certain things have happened; I don't know why and I don't have all the details. I know what to expect from Ryzen 3000 in terms of clock speeds and even performance to a degree. I have seen hardware. What I can't do is share exact details of that here, since I've given my word not to do so and it's something I stand by. Whether you choose to believe me or not is up to you. We only have a few weeks to go, but I hope you two are at least big enough as people to admit that you called people names for the wrong reasons once the products launch.

You're also describing the Intel way of doing things, and yes, this is how they do it, I know that very well. However, AMD doesn't do things in the same logical way, and the motherboard makers have to take a much larger share of the workload when it comes to developing the platforms compared to when they work with Intel. So don't bet on your beliefs being the only way things are done.


----------



## Shatun_Bear (May 11, 2019)

TheLostSwede said:


> Huh, I didn't change anything, you're clearly reading things whichever way you want.
> 
> Lisa Su showed the sample in early January, CES is usually in the first or second week of January, not February. So who's making crap up now?
> Again, read what I wrote: it's easy to call anything an engineering sample if you have never shown it to anyone outside the company before, no?
> ...



January or February, the CPUs were not even close to releasing, so it makes little difference.

I mentioned APUs because AdoredTV included every Ryzen 3000 series APU in his chart back in December alongside the CPUs, which is rather laughable on its own, as these may launch in 2020 since they're getting refreshed on 12 nm this year.

And in terms of plans changing, sure, but there's a difference between Kickstarter and a billion dollar company.

And my only contention is that AdoredTV's numbers are guesses or fake. As I keep repeating, a 4.3 GHz base clock on a 16-core CPU is not happening, nor is a 4.2 GHz base clock on a 12-core (his numbers). I will not be making a public apology regarding this, as no SKU will have these base clock frequencies.

I'm anticipating boost clocks between 4.6 and 4.8 GHz, and if AdoredTV's figures are 'close' to these, that's no vindication of his chart at all, as anyone can make an educated guess. What wasn't educated were the 4.3 GHz and 4.2 GHz base clocks, which are the giveaway that the numbers are fake.


----------



## efikkan (May 11, 2019)

TheLostSwede said:


> Let me be like you two, show me proof. Can you verify any of the stuff you just wrote? Links? Sources?


You know how evidence works, right?
If you claim that Zen 2 was to release in February/March, something that runs contrary to the established knowledge/baseline, then you're the one who has to prove something.

Proving a negative is hard or impossible, but as I've pointed out, there is a lack of telltale signs of an impending release. The signs we have seen popping up afterwards, which even you have acknowledged, point to a release in the "middle" of the year, just like Lisa said during CES, so I don't have to prove anything.



TheLostSwede said:


> You two are speculating a lot, which you're free to do, but at the same time, you then need to accept that others have the same right.


As I've said several times in this thread already, there is nothing wrong in speculating.
But there are not really any speculation in the post you quoted.

Why do you keep pretending that I say things that I don't? Calm down. If your case is strong, you should be able to make your case without insulting people.



TheLostSwede said:


> I know for a fact that certain things have happened, I don't know why and I don't have all the details…


There are probably hundreds or more people who have some degree of access to NDA'd information or engineering samples at this point, and I don't doubt that, but that doesn't mean you have access to every detail. And certain information is not available until close to release, even for AMD's own engineering team, simply because the details are not ready yet.



TheLostSwede said:


> …but I hope you two are at least big enough as people that you can admit that you have called people names for the wrong reasons once the product launch.


I have stated the fact that people can't know a fact before it exists, and that those who claim to are lying. That is not name-calling; that's pretty much the definition of facts versus speculation. And this has nothing to do with AMD; it is the same for all makers, and has nothing to do with how good Zen 2 turns out to be. Speculation is speculation no matter how accurate it turns out to be.


----------



## TheLostSwede (May 11, 2019)

Shatun_Bear said:


> January or February, the CPUs were not even close to releasing it makes little difference.
> 
> I mentioned APUs as AdoredTv included every Ryzen 3000 series APU in his chart back in December alongside the CPUs, which is rather laughable on its own as these may launch in 2020 as they're getting refreshed on 12nm this year.
> 
> ...



How do you know this? Where's the proof? You question my knowledge, so I'll question yours in the same way. You have no more proof than I do, in fact, you have less.

Am I Jim? No. Is Jim my source? No.

Yes and No. You clearly have never developed a product.

Well, you believe whatever you want to believe. I know a few of the clocks, but I've promised not to share the information, but let's just say it can be done. I really do expect you to apologise to both me and Jim when the time comes. You're incredibly stubborn for someone that has zero information about the topic you're discussing.



efikkan said:


> You know how evidence works, right?
> If you claim that Zen 2 was to release in February/March, something that runs contrary to the established knowledge/baseline, then you're the one who have to prove something.
> 
> Proving a negative is hard/impossible, but as I've pointed out there is a lack of telltale signs of an impending release, signs that we have seen popping up afterwards and even you have acknowledged, evidence pointing to a release in the "middle" of the year, just like Lisa said during CES, so I don't have to prove anything.
> ...



I have nothing more or less to prove than you do. I'm not the one arguing that the leaks are a lie.

I don't have to speculate, I know facts. I never said I have all the facts. 

Dude, no offence, but I'm going to block you, as your logic is so flawed I can't deal with you any more.


----------



## HenrySomeone (May 11, 2019)

Zareek said:


> Agreed but the leaks say the R9 3850X 16c/32t @ 4.3Ghz base 5.1Ghz boost will have the highest frequency.  That is followed by R7 3700X 12c/24t @ 4.2Ghz base 5Ghz boost. Personally, I'd like an 8 core with the best clocks they can pull off! The leaks on IPC are all over the place claiming anywhere from 10-29% depending on task. All of them seem to agree AMD will still lag in gaming.


I am almost 100% sure that even the best Zen 2 will muster will still be over 10% slower than the 8700K/9700K/9900K in gaming; in some cases probably more like 20, maybe even 25% (those are the cases where current Zen+ chips lag 40-45%).


----------



## R0H1T (May 11, 2019)

Is that stock 9900k vs 2700x & which games? This *up to 45% margin* sounds bogus to me.


----------



## HenrySomeone (May 11, 2019)




----------



## bug (May 11, 2019)

@HenrySomeone I have this strange feeling someone buying a high-end CPU is unlikely to be stuck playing at FHD.


----------



## storm-chaser (May 11, 2019)

I think one thing we can all agree on is that we live in fascinating times. Let's just enjoy the lead-out and hype, if that's what you want to call it.

Nobody should be having hurt feelings here. Tech discussion should be fun, first and foremost.


----------



## bug (May 11, 2019)

storm-chaser said:


> I think one thing we can all agree on is we live in fascinating times. Lets just enjoy the lead out and hype, if that's what you want to call it.
> 
> Nobody should be having hurt feelings here. Tech discussion should be fun, first and foremost.


Not that fascinating, depending on how long you've been tracking things, but a shake-up we haven't seen since Intel unveiled the Core architecture.


----------



## HenrySomeone (May 11, 2019)

bug said:


> @HenrySomeone I have this strange feeling someone buying a high-end CPU is unlikely to be stuck playing at FHD.


Irrelevant; he wanted to see at least one case of top Intel chips being up to 45% faster, and I provided it. Besides, high-refresh-rate 1080p gaming actually requires the most CPU power anyway, since there's obviously less GPU bottlenecking, and today's 1080p difference is tomorrow's 1440p.


bug said:


> Not that fascinating, depending on how long you've been tracking things, but a shake-up we haven't seen* since Intel unveiled the Core architecture*.


Not even close...


----------



## R0H1T (May 11, 2019)

So OCed 9900k (5.2GHz) vs 2700 (non X) & the biggest difference is in min FPS, that's like cherry picking a particular strain of cherry from a remote country. Do you wanna see a benchmark where AMD destroys Intel by an even higher margin? I think gaming might be difficult but for applications I can give you examples.


HenrySomeone said:


> Irrelevant, he wanted to see at least one case of top Intel chips being up to 45% faster and I provided and besides


No I asked if it was stock vs stock, which this obviously is not!


----------



## efikkan (May 11, 2019)

storm-chaser said:


> I think one thing we can all agree on is we live in fascinating times. Lets just enjoy the lead out and hype, if that's what you want to call it.
> 
> Nobody should be having hurt feelings here. Tech discussion should be fun, first and foremost.


Indeed, in a few months AMD's competitiveness should be at its greatest in over 10 years (for CPUs), and there should be potential for some great deals this fall. This is when the real fun begins.

That doesn't mean we should spread misinformation though.


----------



## HenrySomeone (May 11, 2019)

R0H1T said:


> So OCed 9900k (5.2GHz) vs 2700 (non X) & the biggest difference is in min FPS, that's like cherry picking a particular strain of cherry from a remote country. Do you wanna see a benchmark where AMD destroys Intel by an even higher margin? I think gaming might be difficult but for applications I can give you examples.
> No I asked if it was stock vs stock, which this obviously is not!


Their 2700 at the time OCed slightly better than their 2700X; Steve explained that clearly. Yes, with a golden sample you can get a 4.3 GHz all-core OC, but it hardly matters. Besides, you have your 2700X in the last graph, and stock vs stock the 9900K is just under 40% faster for average frames and, much more importantly, over 50% faster for 0.1% lows. I mean, Ryzen can't even hold over 60 fps at all times, which I find just embarrassing, and all of those are mainstream games; nothing particularly cherry-picked about that. Obviously not all games run that much better on Intel, which I've never claimed, but they exist, and not just one or two. Oh, and please show me that AMD-winning benchmark, but it has to be at thread parity, otherwise it will just be more of that better-value PR.


----------



## R0H1T (May 11, 2019)

You said *up to 45% slower*, so even if we discount the massive OCing handicap (*I'm sure you can do the math*), the 2700 (non-X) is not up to 45% slower in any of those results.
How about a re-take of that one?


HenrySomeone said:


> Oh and please show me that AMD winning benchmark, but it has to be at thread parity, otherwise it will just be more of that better value PR


Check the previous flagship MSDT 7700k vs the lowly 2400G


----------



## storm-chaser (May 11, 2019)

HenrySomeone said:


> Their 2700 at the time OCed slightly better than their 2700x, Steve explained that clearly, yes with a golden sample you can get a 4,3Ghz all-core OC, but it hardly matters and besides you have your 2700*X *in the last graph and stock vs stock 9900k is just under 40% faster for average frames and, much more importantly, over 50% faster for 0,1%  I mean, Ryzen can't even hold over 60 fps at all times which i find just embarrassing and all of those are mainstream games, nothing particularly cherry-picked about that. Obviously not all games run that much better on Intel, which I've never claimed, but they exist and not just one or two. Oh and please show me that AMD winning benchmark, but it has to be at thread parity, otherwise it will just be more of that better value PR



Ignore this guy. Anti AMD troll


----------



## NdMk2o1o (May 12, 2019)

HenrySomeone said:


> Irrelevant, he wanted to see at least one case of top Intel chips being up to 45% faster and I provided and besides, high refresh rate 1080p gaming actually requires the most cpu power anyway, since there's obviously less gpu bottlenecking and also, today's 1080p difference is tomorrow's 1440p
> 
> Not even close...


AMD trolling 101: cherry-pick one result from an unusually CPU-dependent game that is by no means the norm, then cherry-pick numbers out of your butt... Got it, thanks.


----------



## bug (May 12, 2019)

R0H1T said:


> So OCed 9900k (5.2GHz) vs 2700 (non X) & the biggest difference is in min FPS, that's like cherry picking a particular strain of cherry from a remote country. Do you wanna see a benchmark where AMD destroys Intel by an even higher margin? I think gaming might be difficult but for applications I can give you examples.
> No I asked if it was stock vs stock, which this obviously is not!


Tbh, min frame rate is what will kill your gaming experience. But it's still cherry-picking, and as I already pointed out, you're not going to buy a high-end CPU to game at FHD. At QHD and UHD the CPU is no longer the bottleneck anyway.
There are still scenarios where faster cores are needed more than additional cores, but those are the only saving grace left for the current generation of Intel CPUs. If Zen 2 closes that gap, Intel is going to need Ice Lake ultra-fast.


----------



## EarthDog (May 12, 2019)

bug said:


> There are still scenarios where faster cores are needed more than additional cores, but those are the only saving grace left for the current generation of Intel CPUs.


That is most scenarios. There isn't anything a mainstream Intel (or AMD) CPU can't do for 95% of people. 16 cores on mainstream is ridiculous today.


----------



## TheLostSwede (May 12, 2019)

So here's an interesting snippet, Ryzen 3000 is at revision B0 for the new ES samples, I wonder why that is? https://www.tomshardware.com/news/ryzen-3000-bios-stepping-amd,39319.html


----------



## NdMk2o1o (May 12, 2019)

EarthDog said:


> That is most scenarios. There isnt anything a mainstream Intel (or amd) CPU cant do for 95% of people. 16c on mainstream is ridiculous today.


No it's not. If you don't need 16 cores, don't buy them; I'm sure many enthusiasts and previous-gen HEDT users will lap them up. I honestly can't see what the issue is: it would be OK on a different platform, but because it's on AM4's mainstream platform it's ridiculous? That makes no sense. And of course X570 has more PCIe lanes and PCIe 4.0, so even more reason to have a higher core-count processor, IMO. The sad truth is that Intel has hindered technology advancement for the last 10 years with quad cores. Consoles will have and make use of 8 cores, which will now start to become the norm because of that, and PC gamers have always been ahead of the curve when it comes to consoles.


----------



## bug (May 12, 2019)

EarthDog said:


> That is most scenarios. There isnt anything a mainstream Intel (or amd) CPU cant do for 95% of people. 16c on mainstream is ridiculous today.


Amen to that.


----------



## EarthDog (May 12, 2019)

NdMk2o1o said:


> No its not, if you don't need 16c don't buy them, I'm sure many enthusiasts and previous gen hedt users will lap them up, I honestly can't see what the issue is, so it would be OK on a different platform but because its on am4's mainstream platform its ridiculous. That makes no sense. And of course x570 has more pcie lanes and pcie4 so even more of a reason to have a higher core count processor imo the sad truth is Intel has hindered technology advancement for the last 10 years with quad cores, consoles will have and make use of 8 cores which will now start become the norm because of that and pc gamers have always been ahead of the curve when it comes to consoles.


LMK when software catches up... we've been waiting for 8 years.

Yes, it's OK on HEDT; it's a workstation platform, a distinct segment from the rest. More cores on mainstream help fewer people directly.


----------



## bug (May 12, 2019)

EarthDog said:


> LMK when software catches up... weve been waiting for 8 years.


Yeah, the age-old misunderstanding. Software already uses 2-3 orders of magnitude more threads than there are physical cores in a system. The thing is, there's just not enough load to distribute between those threads to saturate 4 cores most of the time, let alone 16.
Core count is becoming the new MHz race. And I don't have anything against building what is essentially a better CPU overall; it's just that many people waste money by not matching what a CPU can do to their actual needs.


----------



## EarthDog (May 12, 2019)

bug said:


> Core count is becoming the new MHz race. And I don't have anything against building what is essentially a better CPU overall, it's just that many people are wasting money by not corroborating what a CPU can do with their actual needs.


Exactly. I don't understand the hard-on most people have for more cores when they can't use them.

That said, I understand innovation and moving forward. I do appreciate the increased IPC and clocks as well as the imminent price drops we are likely to see if these perform well enough. 

The real winners here, and I've said this before, are the cheap quad/hex/octo w/SMT. Much more than that, few need to care.


----------



## bug (May 12, 2019)

EarthDog said:


> Exactly. I dont understand the hard on most people have for more cores when they cant use it.


The hard-on is about getting more cores than you can use and then complaining that developers are lazy and don't want to put your hardware to good use.


----------



## efikkan (May 12, 2019)

It's a common misunderstanding that multicore scaling is primarily a lack of good software. I explained this some more here.
TLDR: most real-world tasks can't scale across an arbitrary number of cores, so unless you're running more tasks, or typical server workloads, adding more and more cores only gives you diminishing returns, and even lower performance if at some point you have to sacrifice per-core performance for more cores.

Single-core performance is essential and will only become more important in the coming years, even for processes that use many threads, because of synchronization overhead. But the clock-speed race seems to be nearly over, so future gains will come from IPC increases.
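The diminishing-returns argument above is essentially Amdahl's law. A minimal sketch (the 80% parallel fraction below is an illustrative assumption, not a measured figure for any real workload):

```python
# Amdahl's law: speedup on n cores when only a fraction p of the work
# can be parallelized. The serial remainder (1 - p) caps the gain.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

if __name__ == "__main__":
    p = 0.8  # assume 80% of the task parallelizes (illustrative only)
    for cores in (1, 2, 4, 8, 16):
        print(f"{cores:2d} cores -> {amdahl_speedup(p, cores):.2f}x")
    # Even with infinitely many cores, the speedup is bounded by 1/(1-p) = 5x,
    # so going from 8 to 16 cores buys far less than the first doubling did.
```

With p = 0.8, 4 cores give a 2.5x speedup but 16 cores only 4.0x; past a point, extra cores mostly sit idle behind the serial fraction.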


----------



## bug (May 12, 2019)

efikkan said:


> It's a common misunderstanding that multicore scaling is primarily a lack of good software. I explained this some more here.
> TLDR; most real-world tasks can't scale across an arbitrary number of cores, so unless you're running more tasks or you're running more typical servers, more and more cores is only going to give you diminishing returns, and even lower performance if you at some point have to sacrifice core performance for more cores.
> 
> Single core performance is essential and will become only more important in the next years, even for those processes which uses many threads, due to the synchronization overhead. But the clockspeed race seems to be nearly over, so future gains will come from IPC increases.


Unfortunately, understanding all that requires programming knowledge. Most people don't have that.
At some point I crossed paths with a guy with several years of programming experience, rather well regarded within his team. When tasked with something that required a mild amount of concurrency, he said, "I'm going to need some time to get familiar with this threading thing." So if a guy who programs for a living can say that, good luck explaining cores and threads to the layman.


----------



## efikkan (May 12, 2019)

bug said:


> Unfortunately, understanding all that requires programming knowledge. Most people don't have that.
> 
> At some point I crossed paths with a guy with several years of programming experience, rather well regarded within his team. When tasked with something that required a mild amount of concurrency, he said "I'm going to need some time to get familiar with this threading thing". So if a guy programming for a living can do that, good luck explaining cores and threads to the layman.


Sure. My main point is that it's not just lack of willingness to adapt multithreading, hard problems are actually hard to solve.

Your story doesn't surprise me at all. Of all the programmers I've dealt with over more than a decade and a half, probably fewer than 5% are at the level of competence needed to deal with problems this complex. Even a typical programmer with 10 years of experience wouldn't fully grasp the problem, even if it were explained in detail. The vast majority of programmers never touch anything this low-level (web developers, app developers, and most people writing enterprise software in Java or C#), so they never develop an understanding of how it works.


----------



## bug (May 12, 2019)

efikkan said:


> Sure. My main point is that it's not just lack of willingness to adapt multithreading, hard problems are actually hard to solve.
> 
> Your story doesn't surprise me at all. Of all the programmers I've dealt with over more than a decade and a half, probably less than 5% is at that level of competence to deal with problems this complex. Even a typical programmer with 10 years of experience wouldn't even fully grasp the problem, even if explained in detail. The wast majority of programmers don't touch anything this low level, like web developers, app developers and most writing enterprise software in Java or C#, so they never develop an understanding of how it works.


Well, if you think about it, higher-level languages actually trivialize multi-threading (think Java's executors, Erlang's spawn or Go's goroutines). The fact that even with this help programs don't fully load the cores is further proof that the things we routinely do don't really need that many cores.
On the other hand, if we programmers would bubble-sort and brute-force everything, the whiny bunch would actually be much happier. Their cores would suddenly be seeing 100% usage.


----------



## efikkan (May 12, 2019)

bug said:


> Well, if you think about it, higher level languages actually trivialize multi-threading (think Java's executors, Erlang's spawn or Go's goroutines). The fact that even with this help programs don't fully load cores is further proof things we routinely do don't really need that many cores.


This only really works if you're writing something that spawns independent worker threads, each of which does a fairly large chunk of work (so the average overhead becomes small). This mostly applies to typical server workloads and is hard to apply efficiently in normal desktop applications.
Some languages have ways to distribute functions across several worker threads, but it usually creates more synchronization overhead and problems than it solves.
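A toy sketch of the chunk-size point, assuming nothing beyond the standard library: the same sum computed with a few large private chunks versus a lock taken on every single update. The fine-grained version serializes the threads on the lock (exact timings vary by machine, so only correctness is checked here):

```python
import threading

N = 100_000
data = list(range(N))
expected = sum(data)

# Coarse-grained: each worker sums a private slice; one merge at the end.
def coarse_sum(nworkers: int = 4) -> int:
    partials = [0] * nworkers

    def worker(i: int) -> None:
        partials[i] = sum(data[i::nworkers])  # no shared state, no locking

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(nworkers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(partials)

# Fine-grained: a shared counter guarded by a lock on every single update,
# so the threads spend most of their time waiting on each other.
def fine_sum(nworkers: int = 4) -> int:
    total = 0
    lock = threading.Lock()

    def worker(i: int) -> None:
        nonlocal total
        for x in data[i::nworkers]:
            with lock:  # synchronization cost paid per element
                total += x

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(nworkers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return total

print(coarse_sum() == expected, fine_sum() == expected)
```

Both produce the same answer; the difference is that the coarse version pays for synchronization once per worker, the fine version once per element.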


----------



## EarthDog (May 12, 2019)

Details... 

Still waiting for software to manage...whatever that means on the programming side.


----------



## bug (May 12, 2019)

efikkan said:


> This only really works if you're writing something that spawns independent worker threads, and each of them do a fairly large chunk of work(so the average overhead becomes small). This mostly applies to typical server workloads, and is hard to apply efficiently in normal desktop applications.
> Some languages have ways to distribute functions across several worker threads, but it usually creates more synchronization overhead and problems than it solves.


Even so, you can have an architect or tech lead come up with the blueprints. My point was that, from a programming point of view, threading has become pretty trivial. Using threading to speed things up is, as you point out, a whole other story; not everything gets faster because you spread the load over more threads. You can split off a game's AI and let it run amok, but unless that game happens to be chess, it still has to sync up with user input and whatnot.
Now let's try to get back on topic


----------



## NdMk2o1o (May 12, 2019)

So octo-cores are OK now that they're mainstream, but any more and it's a no-no. Gotcha; let us know when it's OK by you for us to use more than 8 cores.


----------



## bug (May 12, 2019)

NdMk2o1o said:


> So octo cores are OK now they're mainstream but anymore and it's a no no. Gotcha, let us know when it's OK by you when we can use more than 8 cores


Buying CPUs by the core while disregarding your actual needs is a no-no, but that's seemingly also beyond your comprehension.
Where the hell did you pull that 8-core limitation from anyway?


----------



## NdMk2o1o (May 12, 2019)

bug said:


> Buying CPUs by the core disregarding your actual needs is a no no, but seemingly also beyond your comprehension ability.
> Where the hell did you pull that 8 core limitation from anyway?


You could say the same about any high-end computer component purchase, when you can buy something for a fraction of the price that will still do the same job, or about the guys with 32/64 GB of RAM when 8/16 is sufficient for 90% of people. That's the key: 16-core CPUs won't be bought by the 90%. I won't buy one; I have 6c/12t and that's plenty for me, but I still don't try to tell other people what they should buy based on my use. I don't see what the big deal is: if you don't see a need for it, you won't buy it, simple. And of course there are those with more money than sense who will buy it just because, but that's no different to how it's ever been, regardless of whether it's 8/16/32 cores or whatever.



EarthDog said:


> The real winners here, and I've said this before, are the cheap quad/hex/octo w/SMT. Much more than that, few need to care.


And sorry I'm still catching up with the thread I was replying to this comment from the previous page, my bad


----------



## EarthDog (May 12, 2019)

NdMk2o1o said:


> So octo cores are OK now they're mainstream but anymore and it's a no no. Gotcha, let us know when it's OK by you when we can use more than 8 cores


Jesus, can't see the forest for the trees with this guy. 

Edit: or just not finishing the thread before replying... as I just did, seeing that comment. Haha!

As I've said before, what this does is set up the lemmings and those not in the know (95% of people) to think more cores are better. And to an extent, that is true. But for most users a 6c/12t CPU is plenty and doesn't bottleneck anything (and won't for years)... so yes, I'm annoyed the mainstream is packing in cores. At least with clock speeds, EVERYONE benefits. Cores... few do.


----------



## Aquinus (May 12, 2019)

Honestly, if this pans out to be true, I might consider going with this 16c chip. I'm already at the point of thinking about upgrading, since X79 is a pretty dated platform at this point, and I wanted to replace it with a 2950X. But if I can get 16c/32t for 500 USD instead of 800, I'm all for it (forget the cost of a TR motherboard). In reality I don't need all of that PCIe goodness, and if dual-channel DDR4 can keep at least 12c/24t fully fed under memory-intensive tasks, then I think I've found my next upgrade.

I've waited this long, so let's just wait and see what happens.


----------



## TheLostSwede (May 12, 2019)

Aquinus said:


> Honestly, if this pans out to be true, I might consider going with this 16c chip. I'm already at the point of thinking about upgrading since X79, at this point, is a pretty dated platform and I wanted to replace it with a 2950X, but if I can get 16c/32t for 500 USD instead of 800, I'm all for it (forget the cost of a TR motherboard.) In reality I don't need all of that PCIe goodness and if dual-channel DDR4 can keep at least 12c/24t fully fed under memory intensive tasks, then I think I found my next upgrade.
> 
> I've waited this long, so let's just wait and see what happens.



Expect X570 boards to be pricey. This might be a reason why AMD is keeping the CPU prices on the low. Don't expect the exact pricing from AdoredTV's leak though, as prices have changed since then. Two weeks to go...


----------



## Aquinus (May 12, 2019)

TheLostSwede said:


> Expect X570 boards to be pricey. This might be a reason why AMD is keeping the CPU prices on the low. Don't expect the exact pricing from AdoredTV's leak though, as prices have changed since then. Two weeks to go...


Expensive, sure, but TR expensive? Probably not. We'll know soon enough.


----------



## Leaked (May 12, 2019)

TheLostSwede said:


> Expect X570 boards to be pricey. This might be a reason why AMD is keeping the CPU prices on the low. Don't expect the exact pricing from AdoredTV's leak though, as prices have changed since then. Two weeks to go...


Will the 16-core parts be released later, or with the other CPUs?


----------



## lsevald (May 12, 2019)

What is driving the price of X570 boards up so much? I guess we can use previous-generation boards if it's that bad?


----------



## GoldenX (May 12, 2019)

lsevald said:


> What is driving the price of X570 boards up so much? I guess we can use previous-generation boards if it's that bad?


Wait for the BIOS updates.


----------



## HTC (May 13, 2019)

lsevald said:


> *What is driving the price of X570 boards up so much?* I guess we can use previous-generation boards if it's that bad?



Possibly a lack of volume with the X570 chipsets? *I've read* that the reason Zen 2 was delayed was to wait for the chipset, because the CPUs themselves had been ready for a while, but I'm not 100% sure whether this is true or not.

OTOH, it could very well be board makers attempting to squeeze us consumers ...


----------



## TheLostSwede (May 13, 2019)

Aquinus said:


> Expensive, sure, but TR expensive? Probably not. We'll know soon enough.



Probably not, but expect a $30-50 premium over current boards, if the board makers don't shoulder any of the cost.



Leaked said:


> Will the 16-core parts be released later, or with the other CPUs?



Unclear. AMD hasn't told the board makers the launch lineup yet. But it's correct that there are 12- and 16-core parts with the board makers, as per various rumours.



lsevald said:


> What is driving the price of x570 boards up so much? I guess we can previous generation boards if its that bad?



The extra parts needed for PCIe 4.0. A full PCIe 4.0 board needs two sets, with one set at minimum, on top of the cost of the chipset itself. There seem to be no issues using current boards though; you just don't get the new board-related features.



HTC said:


> Possibly the lack of volume with the X570 chipsets? *I've read* the reason Zen 2 had been delayed was for it to wait for the chipset because the CPUs themselves were ready for a while, but i'm not 100% sure if this is true or not.
> 
> OTOH, it could very well be board makers attempting to squeeze us consumers ...



No squeezing, it's all about PCIe 4.0. Extra re-drivers and re-timers are needed, and these are costly for PCIe 4.0 right now, since very few products need them. That'll likely change over time.


----------



## juiseman (May 13, 2019)

I don't see 16-core CPUs as a bad thing; parallel processing is the only way to go from here until someone figures out how to overcome the current limitations. It seems they've pushed the clock-speed envelope about as far as they can already, so IPC and more cores are the only way for the chip makers to sell and market something new. Things get hairy after 5 GHz-ish; efficiency goes down remarkably after that, as test results repeatedly show.

Just remember: we would all still be on 4 cores if AMD hadn't forced 6- and 8-core CPUs into the mainstream. AMD doing well is good for everybody, AMD and Intel fans alike.
I'm all Intel now myself, but if AMD puts out a good CPU at a good price, I could make use of all those extra cores.
There are a lot of people who would say the same.
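
The efficiency cliff near 5 GHz follows from the usual dynamic-power approximation, P ≈ C·V²·f, since higher clocks also demand higher voltage. A back-of-the-envelope sketch with hypothetical voltage/frequency numbers:

```python
def dynamic_power(freq_ghz, volts, capacitance=1.0):
    """Rough dynamic CPU power: P = C * V^2 * f (arbitrary units)."""
    return capacitance * volts ** 2 * freq_ghz

base = dynamic_power(4.0, 1.10)  # stock-ish operating point (hypothetical)
oc = dynamic_power(5.0, 1.35)    # higher clocks usually need more voltage
ratio = oc / base                # ~1.9x the power for only 1.25x the frequency
```

A 25% clock bump costing nearly double the power is why performance-per-watt falls off so sharply near the top of the voltage/frequency curve.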


----------



## xorbe (May 14, 2019)

I need this budget-priced 16c/32t CPU for some hobbyist Linux box fun!


----------



## Aquinus (May 14, 2019)

TheLostSwede said:


> Probably not, but expect a $30-50 premium over current boards, if the board makers don't shoulder any of the cost.


To get 16c/32t for 300 USD less than the 2950X? I could live with that.


----------



## storm-chaser (May 15, 2019)

juiseman said:


> Just remember; we would all still be on 4 cores if AMD didn't force the 6 and 8 core CPU's into the mainstream.
> AMD doing good is good for everybody. AMD or Intel fans.
> I'm all Intel now myself; but if AMD puts out a good CPU at a good price; I could make use of all those extra cores..
> There is a large amount of people that would say the same.



While AMD did achieve something great with the release of its multi-core CPUs, it was Intel who first released a quad-core CPU to the market.
And while I agree with you that AMD capitalized on the market by producing 6- and 8-core CPUs, I don't think we'd still be stuck on four cores in the mainstream. Could be wrong though...


----------

