# AMD Ryzen 5 5600X Takes the Crown of the Fastest CPU in Passmark Single-Thread Results



## AleksandarK (Oct 23, 2020)

AMD has been steadily improving its Zen core design, and with the latest Zen 3 IP found in Ryzen 5000 series CPUs, it seems the company has struck gold. Thanks to reporting from VideoCardz, we have learned that AMD's upcoming Ryzen 5 5600X has been benchmarked and compared to competing offerings. In PassMark, a CPU benchmark that rates CPUs by multi-threaded and single-threaded performance, AMD's Ryzen 5 5600X has taken the crown of the fastest CPU in the single-threaded results chart. Scoring an amazing 3495 points, it is now the fastest CPU for 1T workloads. That puts it above Intel's current best, the Core i9-10900K, which scores 3177 points, and puts the Zen 3 core about 10% ahead of the competition.

As a reminder, the AMD Ryzen 5 5600X is a six-core, twelve-thread design with a base clock of 3.7 GHz that boosts its cores up to 4.6 GHz, all within a 65 W TDP. The CPU has 32 MB of level-3 (L3) cache and 3 MB of L2 cache.
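As a quick sanity check of the quoted lead, the arithmetic works out like this (a minimal sketch; the two scores are the PassMark figures reported above):

```python
# Single-thread PassMark scores quoted in the article.
zen3_5600x = 3495   # Ryzen 5 5600X, 1T score
cml_10900k = 3177   # Core i9-10900K, 1T score

# Relative lead of the 5600X over the 10900K.
lead = (zen3_5600x - cml_10900k) / cml_10900k
print(f"Zen 3 single-thread lead: {lead:.1%}")  # → 10.0%
```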





*View at TechPowerUp Main Site*


----------



## Dredi (Oct 23, 2020)

Not really relevant to the news article, but I wonder how long it’ll take for userbenchmark to change its point system (again) after the launch? Will it become just a memory latency test? XD


----------



## Xuper (Oct 23, 2020)

Isn't passmark sensitive to memory latency or L3 cache?


----------



## s3thra (Oct 23, 2020)

I think I'll pick one of these up to replace my R5 2600. I'll wait for the review by W1zzard first though.


----------



## TheLostSwede (Oct 23, 2020)

At least in Passmark, it's trading blows with the Ryzen 7 3700X even in the full CPU test...


			PassMark - AMD Ryzen 5 5600X - Price performance comparison


----------



## PooPipeBoy (Oct 23, 2020)

It will be on top until the same benchmark is run on the higher-clocked Zen 3 chips, if the Cinebench R20 scores are anything to go by...


----------



## Space Lynx (Oct 23, 2020)

is the 5600x all on one ccx die? or is it chiplet like the 12 core and 8 core variants?


----------



## Dredi (Oct 23, 2020)

lynx29 said:


> is the 5600x all on one ccx die? or is it chiplet like the 12 core and 8 core variants?


One CCX on one CCD and connected to a separate IO-die.


----------



## ShurikN (Oct 23, 2020)

lynx29 said:


> is the 5600x all on one ccx die? or is it chiplet like the 12 core and 8 core variants?


6 core and 8 core parts are 1 CCX on one chiplet.


----------



## Dammeron (Oct 23, 2020)

Looks like 5900X will finally take over my 2600k's place, after 10 years of waiting for something good.


----------



## RedelZaVedno (Oct 23, 2020)

Very nice chip, but no way am I paying $300 for 6 cores in 2020/21. I'm gonna wait for 5600 and OC it. 5% more performance at best is not worth 80 bucks more imho.


----------



## ratirt (Oct 23, 2020)

RedelZaVedno said:


> Very nice chip, but no way am I paying $300 for 6 cores in 2020/21. I'm gonna wait for 5600 and OC it. 5% more performance at best is not worth 80 bucks more imho.


There's always the question of how the 5600X will OC. Another question is whether the non-X 5600 will launch at all.


----------



## oxrufiioxo (Oct 23, 2020)

TheLostSwede said:


> At least in Passmark, it's trading blows with the Ryzen 7 3700X even in the full CPU test...
> 
> 
> PassMark - AMD Ryzen 5 5600X - Price performance comparison




Ouch that 9700K is looking pretty bad and I'm sure not even my 9900K is overly safe


----------



## Super XP (Oct 23, 2020)

And that's not the top-of-the-line Zen 3 either.
AMD is going to sway lots of PC gamers onto Zen 3 & RDNA2.
Fair competition is great.



Dredi said:


> Not really relevant to the news article, but I wonder how long it’ll take for userbenchmark to change its point system (again) after the launch? Will it become just a memory latency test? XD


Userbenchmark changed a lot of things because Intel & Nvidia were bickering about it. They need to leave it alone and test the raw performance and not favor Intel & Nvidia anymore.



oxrufiioxo said:


> Ouch that 9700K is looking pretty bad and I'm sure not even my 9900K is overly safe


Time for a ZEN3 upgrade


----------



## Metroid (Oct 23, 2020)

I don't know what Lisa is doing, but she is doing it right: new management, new plan, new thinking, new products, new records, new highs, new performance. When Lisa came in, AMD was so far behind Intel and Nvidia that only a miracle would turn the tide. Good work, Lisa.

I hope MSI releases a BIOS for my B450 before January 2021 so I can purchase that 5600X or the 5900X.


----------



## kapone32 (Oct 23, 2020)

lynx29 said:


> is the 5600x all on one ccx die? or is it chiplet like the 12 core and 8 core variants?


It's the big brother of the 3300X.


----------



## Mats (Oct 23, 2020)

oxrufiioxo said:


> Ouch that 9700K is looking pretty bad and I'm sure not even my 9900K is overly safe


The 9700K seems to be locked at 3.6 GHz? Weird.

Also, I don't remember the 9900K (also locked here) being inferior to the 3700X.. looks like BS to me.


----------



## oxrufiioxo (Oct 23, 2020)

Super XP said:


> Time for a ZEN3 upgrade



Oh definitely, either going 5950X or 5900X, since I have a very nice X570 Aorus Master just sitting in a box.......



Mats said:


> The 9700K seems to be locked at 3.6 GHz? Weird.
> 
> Also, I don't remember the 9900K (also locked here) being inferior to the 3700X..
> View attachment 173050



It's whatever the base clock is..... Otherwise these are just stock results.


----------



## Goflying (Oct 23, 2020)

Dammeron said:


> Looks like 5900X will finally take over my 2600k's place, after 10 years of waiting for something good.


Wow... You can use it for 10 years!?!


----------



## Mats (Oct 23, 2020)

oxrufiioxo said:


> its whatever the base clock is..... Otherwise these are just stock results.


Nah there's something wrong here.

Has the 3700X ever been *21% faster* than the 9900K in anything?


----------



## oxrufiioxo (Oct 23, 2020)

Mats said:


> Nah there's something wrong here.
> 
> Has the 3700X ever been *21% faster* than the 9900K in anything?




When using the default Intel spec for power limits it's pretty shitty, only slightly faster than a 2700X in MT workloads....... Gaming isn't affected though. Not sure about the 21% either, but most reviews tested the chip with all limits removed.


----------



## Mats (Oct 23, 2020)

oxrufiioxo said:


> When using the default intel spec for power limits its pretty shitty only slightly faster than a 2700X in MT workloads.......


Yeah I know, but that's a big if in this context given that it's based on 7000 benchmarks.

I've been thinking for a while now that PassMark is just like Geekbench: *it doesn't really tell us anything.*


----------



## oxrufiioxo (Oct 23, 2020)

Mats said:


> Yeah I know, but that's a big if in this context given that it's based on 7000 benchmarks.
> 
> I've been thinking for a while now that passmark is like geekbench: *it doesn't really tell us anything.*




I agree, it's not quite as bad as UserBenchmark, but it's not a good tool to base any purchasing decision on. As always, wait for the usual suspects, including TPU, to get the whole and much more accurate picture.


----------



## Turmania (Oct 23, 2020)

Are they gimping intel for the charts? Seems like they fix intel at their base speeds. Something is off here. I would rather wait for proper reviews.


----------



## Dredi (Oct 23, 2020)

Mats said:


> Nah there's something wrong here.
> 
> Has the 3700X ever been *21% faster* than the 9900K in anything?


It is some 40% faster in 7zip compression. I’m sure there are some other specific loads where the difference is over 20%.


----------



## oxrufiioxo (Oct 23, 2020)

Turmania said:


> Are they gimping intel for the charts? Seems like they fix intel at their base speeds. Something is off here. I would rather wait for proper reviews.




I think the only thing this proves is a lot of intel owners run their chips on crappy motherboards with crappy configurations.....


----------



## Turmania (Oct 23, 2020)

From what I see, they fix all Intel CPUs at their base speeds and let the 5600X run loose at its normal speeds, because the charts certainly do not state that they fix the 5600X. If that is true, this is some of the worst misleading and false marketing I have witnessed from AMD, taking people for stupid. But perhaps I'm misreading something, and I refuse to believe my suspicions for now.


----------



## Mats (Oct 23, 2020)

Dredi said:


> It is some 40% faster in 7zip compression. I’m sure there are some other specific loads where the difference is over 20%.


TPU says 15.5%.
Anyway, I shouldn't have phrased it like that, because benchmarks for specific tasks vary a lot. Looking at overall performance, though, 21% is just too much.
In TPU's review the 9900K comes out on top by 3%.



Turmania said:


> From what I see, they fix all Intel CPUs at their base speeds and let the 5600X run loose at its normal speeds, because the charts certainly do not state that they fix the 5600X.


Unless the benchmark is somehow capable of locking the clock speed on said CPUs, I find that highly unlikely; the 9900K figure is an average of 7000 user-submitted benchmark runs.
I think it's just a bad benchmark.


Turmania said:


> If that is true, this is some of the worst misleading and false marketing I have witnessed from AMD.


You think AMD owns passmark?


----------



## Valantar (Oct 23, 2020)

TheLostSwede said:


> At least in Passmark, it's trading blows with the Ryzen 7 3700X even in the full CPU test...
> 
> 
> PassMark - AMD Ryzen 5 5600X - Price performance comparison


Wow, that's a 75% increase over my 1600X. Even three generations later, that's very, very impressive (especially considering the 30 W TDP drop).


----------



## RedelZaVedno (Oct 23, 2020)

ratirt said:


> There is always a question, how the 5600x will OC. Another question is if the 5600 non-X will launch anyway.


AMD has to offer something around 200 bucks... that's the most commonly bought price range. The 5600X is overpriced imho. I'd rather go with the 8C i7-10700 (non-K) for that money.


----------



## Turmania (Oct 23, 2020)

Mats said:


> TPU says 15.5%.
> Anyway, I shouldn't have phrased it like that, because benchmarks for specific tasks vary a lot. Looking at overall performance, though, 21% is just too much.
> In TPU's review the 9900K comes out on top by 3%.
> 
> ...



Why do they designate Intel CPUs with "@ xxxx speed" and not their own? f*** it, I will just wait for the reviews. I do expect it to outperform the rival product though.


----------



## Sithaer (Oct 23, 2020)

Valantar said:


> Wow, that's a 75% increase from my 1600X. Even for three generations later, that's very, very impressive (especially considering the 30W TDP drop).



You are making my 1600X feel bad now.

Tbh I'm kinda curious about the cheaper models later, probably next year or so. _~200$ range max_.

Those might make me retire the 1600x/B350 after 3 years of using them.


----------



## Turmania (Oct 23, 2020)

RedelZaVedno said:


> AMD has to offer something around 200 bucks... It's the most commonly bought price range. 5600X is overpriced imho. I'd rather go with 8C i7 10700 (non K)  for that money.


Judging by their products, and $300 for their 6-core CPU, I expect a 4-core/8-thread CPU in the $200 segment. They really upped the prices, and are very confident they will rule the competition. Say, a Ryzen 3 5300X for $200.


----------



## freeagent (Oct 23, 2020)

Good job AMD. I honestly never thought I would see the day. 

Took you long enough!

Did you ditch the ****ing pins for pads yet?

I always said I would only go back to AMD when they can beat Intel.. in everything.

I wonder if they are stable like a table now..? The last time I rolled with them they were ok for the most part, but not what I'm used to now.


----------



## Dredi (Oct 23, 2020)

Mats said:


> TPU says 15.5%.
> Anyway, I shouldn't have phrased it like that, because benchmarks for specific tasks vary a lot. Looking at overall performance, though, 21% is just too much.
> In TPU's review the 9900K comes out on top by 3%.


On Tom's Hardware the difference is 40%. Maybe they ran the processor at stock settings, or the SW version was different? In any case, there are workloads where the difference is higher than 20%.


----------



## Valantar (Oct 23, 2020)

Sithaer said:


> You are making my 1600X feel bad now.
> 
> Tbh I'm kinda curious about the cheaper models later, probably next year or so. _~200$ range max_.
> 
> Those might make me retire the 1600x/B350 after 3 years of using them.





Turmania said:


> Judging by their products, and $300 for their 6 core cpu, I expect a 4 core 8 thread cpu at $200 segment. they really upped the prices, and are very confident they will rule the competition. say, ryzen 3 5300x for $ 200.


There have been rumors of a ~$220 5600 non-X, and frankly it would be really, really weird for AMD not to make such a SKU. $200 for 4c8t even at these performance levels isn't good enough in 2020.


----------



## Sithaer (Oct 23, 2020)

Valantar said:


> There have been rumors of a ~$220 5600 non-X, and frankly it would be really, really weird for AMD not to make such a SKU. $200 for 4c8t even at these performance levels isn't good enough in 2020.



To be honest I would be okay with a decently priced 4/8 say 5300x, not for 200$ tho.
I'm mainly after the single core perf as the games I'm playing rarely multi thread well and I don't do work/CPU heavy related stuff on my PC either.


----------



## BoboOOZ (Oct 23, 2020)

Turmania said:


> Judging by their products, and $300 for their 6 core cpu, I expect a 4 core 8 thread cpu at $200 segment. they really upped the prices, and are very confident they will rule the competition. say, ryzen 3 5300x for $ 200.


You'd be better off with a 3700x for that price.
New games will like 16 threads, it would be naive to "upgrade" to a 4 core in 2021.


----------



## Turmania (Oct 23, 2020)

Sithaer said:


> To be honest I would be okay with a decently priced 4/8 say 5300x, not for 200$ tho.
> I'm mainly after the single core perf as the games I'm playing rarely multi thread well and I don't do work/CPU heavy related stuff on my PC either.


I agree, a chip like that should be no more than 150 at most, considering the 3300X sold for 125, though you haven't been able to get your hands on one for months now...
But Intel's i3-10320, their speediest 4-core/8-thread chip, is around 175 now, and AMD surely thinks their range is better and is charging more than Intel's current offerings, so I would not be surprised at $200! Not saying I would buy it, as I won't at that price. But these are the chips that sell and make the most profit for both companies, not your $400 chip.


----------



## Mats (Oct 23, 2020)

Turmania said:


> Why do they designate Intel CPUs with "@ xxxx speed" and not their own?


Their???

This doesn't come from AMD, how many times do I have to tell you? 



Dredi said:


> On toms hw the difference is 40%. Maybe they run the processor on stock settings, or the SW version was different?


TPU runs at stock, and I'd guess Tom's does as well.



Dredi said:


> In any case there are cases where the difference is higher than 20%.


That was my point. There are always extremes, but passmark is supposed (AFAIK) to show overall performance, not extremes. That's a big difference.
Otherwise it would be like saying which graphics card is best based on *one* gaming benchmark.


----------



## Sithaer (Oct 23, 2020)

Turmania said:


> I agree, a chip like that should be no more than 150 at most, considering the 3300X sold for 125, though you haven't been able to get your hands on one for months now...
> But Intel's i3-10320, their speediest 4-core/8-thread chip, is around 175 now, and AMD surely thinks their range is better and is charging more than Intel's current offerings, so I would not be surprised at $200! Not saying I would buy it, as I won't at that price. But these are the chips that sell and make the most profit for both companies, not your $400 chip.



The 3300X was around 160$ in my country before it vanished, the 3600 ~200 before the price increase. _'~230 nowadays'_

It really comes down to the pricing; sure, I could always grab a 3600 from the second-hand market, but that's preferably my last option. _'This performance increase with the 5000 series is also hard to ignore'_

Not in a hurry tho, saving up for a GPU atm and maybe a new CPU around summer 2021. _'That's when both my mobo+CPU will be 3 years old, and that's when I usually upgrade'_


----------



## voltage (Oct 23, 2020)

took them long enough. decades in fact.


----------



## Mats (Oct 23, 2020)

voltage said:


> took them long enough. decades in fact.


No wonder, it's not like AMD has the same resources.


----------



## Makaveli (Oct 23, 2020)

Goflying said:


> Wow... You can use it for 10 years!?!



I ran an i7-970 for 10 years; it wasn't that difficult to do.

So I went from Westmere/Gulftown, skipped everything after that, and went to Zen 2, soon to be Zen 3.

I only upgraded my rig in Dec 2019.


----------



## Patuga (Oct 23, 2020)

Makaveli said:


> I ran a i7-970 for 10 years wasn't that difficult to do.
> 
> So went from Westmere/gulftown and skipped everything after that and went to Zen 2 soon to be Zen 3.
> 
> I only upgraded my rig in dec 2019.



I still have one of them in one of my rigs, running on a Rampage III Black Edition.


----------



## Postmodum (Oct 23, 2020)

I guess to upgrade my 3900X I would go 5950X next time...

Amazing performance from the 5600X, but my 3900X can still get me the full FPS that my monitor outputs (165 Hz). Decisions, decisions...


----------



## HD64G (Oct 23, 2020)

Valantar said:


> There have been rumors of a ~$220 5600 non-X, and frankly it would be really, really weird for AMD not to make such a SKU. $200 for 4c8t even at these performance levels isn't good enough in 2020.


They need to wait a few months to gather enough low-binned chiplets from the wafers already made, so they have enough chips to start selling that product, as it will sell like hot bread.


----------



## Dredi (Oct 23, 2020)

Mats said:


> TPU run on stock, and I'd guess Toms does as well.
> 
> 
> That was my point. There are always extremes, but passmark is supposed (AFAIK) to show overall performance, not extremes. That's a big difference.
> Otherwise it would be like saying which graphics card is best based on *one* gaming benchmark.


Tom's 5 GHz OC result was pretty close to the number the 9900K got on TPU. Also, the Ryzen result here is significantly lower. Anyway, the SW version is different too.

I'm with you on this being a bit different already: Intel has previously dominated these silly microbenches, and this indicates that the performance gains are going to be huge.


----------



## Valantar (Oct 23, 2020)

HD64G said:


> They need to wait for a few months to gather enough low-binned chiplets from the wafers made to have enough chips to start selling that product as it will be sold as hot bread.


Oh, absolutely, they have no reason to push out very cut-down chips early. Keeping the highest-margin products exclusively available for a few months allows for building up stocks of the lower-end, higher-volume ones, builds up demand by having the most attractive products already out there, and makes some impatient people spend more than they would otherwise. It's a win-win-win scenario for AMD. If it weren't for the price hikes it would also be decent towards users, though with those taken into consideration it's just a clear statement from AMD that "we're not the budget option any more".



Dredi said:


> Toms 5ghz oc result was pretty close to the number 9900k got on TPU. Also the ryzen result is here significantly lower. Anyway, the sw version is different too.


One possible explanation is motherboard boost settings. What "stock" means here is open to interpretation after all. Intel's specs, the motherboard manufacturers' settings, or something else entirely? Stock on a board with MCE active vs stock on a board where it isn't might be the difference between 5.1GHz all-core and much lower clocks.

Different software versions is also pretty huge though, especially for specific workloads like compression. Might be that one of them has optimizations included that the other lacks.


----------



## ARF (Oct 23, 2020)

Dredi said:


> Not really relevant to the news article, but I wonder how long it’ll take for userbenchmark to change its point system (again) after the launch? Will it become just a memory latency test? XD



Rocket Lake is coming and will regain that single-threaded performance crown, despite the fact that in a multi-threaded world the single-thread performance DOES NOT matter.


----------



## GhostRyder (Oct 23, 2020)

(Raises eyebrow)
Wow, well, that may be an interesting pickup for my portable rig since it has an X570 board. Might want to try it out.

However, I am curious what the results are with both overclocked, and whether that changes anything, as both brands are pushing their chips to their limits.


----------



## rodneyhchef (Oct 23, 2020)

Dammeron said:


> Looks like 5900X will finally take over my 2600k's place, after 10 years of waiting for something good.





Goflying said:


> Wow... You can use it for 10 years!?!



I’m also still running a 2600K! Next year it will be 10 years since I bought the CPU. Next year I will be upgrading.


----------



## efikkan (Oct 23, 2020)

Let's wait for proper benchmarks, but the 5600X and 5800X will be the two models to watch for most of you.
12 and 16 cores are really only relevant for those who run specific workloads which scale beyond 8 cores. Don't get fixated on synthetic benchmarks, or on benchmarks which are not relevant to you.



ARF said:


> Rocket Lake is coming and will regain that single-threaded performance crown, despite the fact that in a multi-threaded world the single-thread performance DOES NOT matter.


Quite on the contrary; single threaded performance is more important than ever, and is after all the base scaling factor for multithreaded performance.
Single threaded performance also helps "everything", while more cores only helps certain workloads.


----------



## Imsochobo (Oct 23, 2020)

Turmania said:


> From what I see, they fix all Intel CPUs at their base speeds and let the 5600X run loose at its normal speeds, because the charts certainly do not state that they fix the 5600X. If that is true, this is some of the worst misleading and false marketing I have witnessed from AMD, taking people for stupid. But perhaps I'm misreading something, and I refuse to believe my suspicions for now.



These are tests from several sources, i.e. users; it's not in-house testing they have control over, just like UserBenchmark.
They cannot control what users do!!!

The weighting can be found on their forums.
The baselines for each model can be found easily.

Baseline URL.

This is one of the results from a 10600K used for the aggregate, which is what you see; it clearly states it turbo'd to 4.8 GHz.


----------



## Fabio (Oct 23, 2020)

RedelZaVedno said:


> Very nice chip, but no way am I paying $300 for 6 cores in 2020/21. I'm gonna wait for 5600 and OC it. 5% more performance at best is not worth 80 bucks more imho.


I got an 8700K 3 years ago for 350, and mine, locked at 4.9 GHz, is still not bad compared to that new 5600X: it's 3150 single-thread vs 3500. In games they will probably be very close.
The 8700K in 2017 was really a good CPU for its time.


----------



## michwoz (Oct 23, 2020)

Dredi said:


> Not really relevant to the news article, but I wonder how long it’ll take for userbenchmark to change its point system (again) after the launch? Will it become just a memory latency test? XD



I've been wondering the same thing. I also expect that memory latency will suck much less on Zen 3.


----------



## TheEndIsNear (Oct 23, 2020)

Dammeron said:


> Looks like 5900X will finally take over my 2600k's place, after 10 years of waiting for something good.



What a legendary chip that was.  Just like the Celeron 300 I had back in the day.  Oh and not to forget the Abit BP-6 I think with the dual celerons and NT 4 playing quake 2 online.  Ahhh memories.


----------



## Valantar (Oct 23, 2020)

Turmania said:


> From what I see, they fix all Intel CPUs at their base speeds and let the 5600X run loose at its normal speeds, because the charts certainly do not state that they fix the 5600X. If that is true, this is some of the worst misleading and false marketing I have witnessed from AMD, taking people for stupid. But perhaps I'm misreading something, and I refuse to believe my suspicions for now.


That is nonsense. These are crowd-sourced benchmark numbers; they have zero control over the clock speeds of the CPUs in question. _That_ is of course a significant negative and an argument against the reliability of this data, which thankfully they also list prominently on the results page. It's entirely possible that this is an OC score (though highly unlikely, given that overclocking Ryzen for the past three generations has actually meant giving up ST performance outright, due to the loss of single-core boost, in favor of possibly higher multi-core performance), but given the number of 10900K scores, we can absolutely know that this is a representative score in that benchmark for that part. Is it representative of real-world performance? Only if the workload resembles this benchmark, as with all benchmarks. If there is a discrepancy in the reported clocks (Intel reported at base clocks and AMD at boost clocks), that is likely down to how the system reports data to the benchmark, and nothing else; those numbers aren't actual measurements of clock speeds while running the benchmark.


----------



## cyberloner (Oct 23, 2020)

hmm fx8350 for 8 years....so attracted by this cpu...


----------



## RandallFlagg (Oct 24, 2020)

What I find most interesting in that chart is how a Tiger Lake 15-28W laptop chip is within 1% of the 5600X in single thread.



PooPipeBoy said:


> It will be on top until the same benchmark is run on the higher-clocked Zen 3 chips, if the Cinebench R20 scores are anything to go by...


----------



## Imsochobo (Oct 24, 2020)

RandallFlagg said:


> What I find most interesting in that chart is how a Tiger Lake 15-28W laptop chip is within 1% of the 5600X in single thread.



Intel laptop designs tend to consume as much power as desktops for 1T loads.
Desktops have more cooling, so you can turbo for longer, and you have a bigger T delta allowing for slightly higher frequencies.

Tiger Lake is at least some breath of hope...


----------



## RandallFlagg (Oct 24, 2020)

Imsochobo said:


> Intel laptop designs tend to consume as much power as desktops for 1t loads.
> Desktops have more cooling so you can turbo for longer and you have a bigger T delta allowing for slightly higher frequencies.
> 
> Tigerlake is atleast some breath of hope..



References for that power comment? Only the i7-10750H and higher pull up to 65 W, and that only for ~30 seconds, just like the Renoir Zen 2 laptop chips. And they only do that in multi-core, not single-core. Then they drop back to their rated 45 W unless you change that in the BIOS. The 5600X is 65 W TDP, meaning that is most likely its sustained PL1 power draw; I would imagine, like most chips (Intel and AMD), its PL2 turbo is probably more like 95 W. Not sure how much more wrong you could be here.

Tiger Lake will pull around 54 W PL2 for a short time, like 30 s, then pull back down to its sustained PL1 power draw of 28 W. Cinebench is not a short benchmark, so you are talking about a chip that draws around half the power of the 65 W TDP 5600X, or less. And it scored within 1% of that 5600X in single-threaded Cinebench. Hence my comment.
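The PL1/PL2 behavior described above can be sketched as a toy model (the 28 W / 54 W / ~30 s figures are the ones from this post; real firmware enforces a rolling exponentially weighted power budget rather than a fixed timer):

```python
def package_power(t_seconds, pl1=28.0, pl2=54.0, tau=30.0):
    """Toy model of the power limits described above: the chip may
    draw up to PL2 watts for roughly `tau` seconds of sustained load,
    then falls back to the long-term PL1 limit."""
    return pl2 if t_seconds < tau else pl1

# Short burst sits at PL2; a long Cinebench-style run settles at PL1.
print(package_power(10))   # within the turbo window → 54.0
print(package_power(60))   # sustained load → 28.0
```

This is why a long benchmark run reflects PL1, not the headline turbo power.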

To that point, the Ryzen 4900HS in the Asus ROG G14, the flagship high-performance Ryzen model and probably about the fastest you could get 3 months ago, got smoked in an Ars review by the Tiger Lake based MSI Summit.


----------



## ARF (Oct 24, 2020)

efikkan said:


> Quite on the contrary; single threaded performance is more important than ever, and is after all the base scaling factor for multithreaded performance.
> Single threaded performance also helps "everything", while more cores only helps certain workloads.



Why don't you try to work with a single-core processor, if single-thread performance is so important?
The applications are thread-count starved; we all need more cores, because we are always limited by the speed of execution of a single thread, and the only way to overcome this limit is to use more cores.

Oh, and to be honest - this PassMark thread is heavily AMD optimised.

Next time try with Microsoft Office, Adobe Photoshop, WinRAR compression, Super Pi and other beautiful benchmarks where AMD CPUs get slaughtered.


----------



## PooPipeBoy (Oct 24, 2020)

ARF said:


> Why don't you try to work with a single-core processor if the single-thread performance is so important?
> The applications are thread-count starved - we all need more cores because we are always limited by the speed of execution of a single thread, the only way to overcome this limit is to use more cores.



Ah, single-threaded performance must mean nothing then. Gotcha.

Anyone know where I could buy a 128 core processor to improve my computer's performance by 3000%?


----------



## efikkan (Oct 24, 2020)

ARF said:


> Why don't you try to work with a single-core processor if the single-thread performance is so important?


Despite being so active in these deeply technical subjects, you clearly don't know what "single threaded performance" means.
You should think of single threaded performance as _performance *per* core_, because that's what it really is, and forms the theoretical upper limit of multithreaded scaling; cores * performance per core.

All workloads split over multiple cores will encounter diminishing returns with increased core count, as synchronizing more cores is inevitably going to take more time. This overhead might not be significant if you're doing a large batch job that takes minutes or even hours, but if it's an interactive application or a game, then you have a very critical time limit before the application becomes laggy and non-responsive. Since there is an overhead cost with each thread you synchronize, balancing how threads share data and the size of work chunks is essential for good multithreaded performance. In such cases faster cores will lead to better utilization and less stalls and lag, essentially you can scale to more cores before performance gains become negligible.

As I said in my previous post, single threaded performance helps "everything". Whether an application uses 1 or 128 threads, an increase in single threaded performance is nearly always going to benefit a computational workload, and sometimes even help multithreaded performance even more due to less overhead.



ARF said:


> The applications are thread-count starved - we all need more cores because we are always limited by the speed of execution of a single thread, the only way to overcome this limit is to use more cores.


If an application is in fact starved for more threads, then more threads are good.
But applications have to be carefully designed to scale well. Large batch jobs are "easy", while applications like Photoshop etc. are harder. That's why you often see with such applications that more cores help a little up to a point, but faster cores always help.
Unless an application benefits from thread isolation (which some web server tasks do), faster cores are always going to perform better than more cores. If you have the option between a CPU that has 50% more cores or one that's 50% faster per core, the latter will nearly always win.



ARF said:


> Oh, and to be honest - this PassMark thread is heavily AMD optimised.


Typical rookie mistake.
There is really no such thing as "AMD optimized" or "Intel optimized".
Just because a piece of software performs better on one specific piece of hardware doesn't mean it's "optimized" for it. In 99.9% of cases it simply comes down to the resource balance. The exceptions are the few cases where unique ISA features are utilized, or where the application intentionally runs a slower code path on certain hardware.


----------



## AMD718 (Oct 24, 2020)

Dammeron said:


> Looks like 5900X will finally take over my 2600k's place, after 10 years of waiting for something good.



This is me. 10 years with a 2600k oc'd to 4.9 GHz. Will be moving to 5900 or 5950.


----------



## Dredi (Oct 24, 2020)

RandallFlagg said:


> References for that power comment?  Only the i7-10750H and higher pull up 65W, and that only for ~30 seconds, just like the Renoir Zen 2 laptop chips.  And they only do that in multi-core, not single core.  Then they drop back to their rated 45W unless you change that in the BIOS.  The 5600X is 65W TDP meaning that is most likely its sustained PL1 power draw, I would imagine like most chips (Intel and AMD) its PL2 turbo is probably more like 95W.   Not sure how much more wrong you could be here.
> 
> Tiger Lake will pull around 54W PL2 for a short time, like 30s, then pull back down to sustained power draw PL1 = 28W.  Cinebench is not a short benchmark, so you are talking about a chip that draws around or less than half the power of the 65W TDP 5600X.  And it scored within 1% on single thread Cinebench of that 5600X.  Hence my comment.
> 
> ...


The highest-bin Tiger Lake consumes about 30 W at 4.8 GHz in a medium-intensity 1T load, according to AnandTech's fairly comprehensive tests. In contrast, Zen 2 can't really use more than 18 W per core (max 20 W on Renoir with uncore etc.; Matisse is not really power optimized and a lot of power goes to the IF and IO die).

Rocket Lake is going to be the juice king: 5 GHz all-core at around 320 W in Cinebench-like loads, and Prime95 will be higher.


----------



## birdie (Oct 24, 2020)

I know AMD fans are super excited but I'd love to shit on your parade. You know why? AMD has finally outperformed an Intel uArch from ... 2015. This might sound like a great achievement but honestly it's just because Intel has completely f*ed up their 10nm transition. Yeah, their latest 10nm++ node (first - Cannon Lake, second - Ice Lake and now Tiger Lake) allows it to boost to 4.8GHz at the expense of insane power consumption, and they've made changes to the Willow Cove core architecture which sometimes translate to lower performance than Ice Lake:






In short Intel has turned from an indisputable x86 performance leader to something else entirely and AMD has quickly seized the opportunity to significantly increase their prices. An entry level Ryzen 5000 CPU, Ryzen 5 5600X, is now 50% (!) more expensive than its Ryzen 3000 counterpart, Ryzen 5 3600. There's nothing to be excited about. One struggling monopoly has been replaced by another.


----------



## Zach_01 (Oct 24, 2020)

birdie said:


> I know AMD fans are super excited but I'd love to shit on your parade. You know why? AMD has finally outperformed an Intel uArch from ... 2015. This might sound like a great achievement but honestly it's just because Intel has completely f*ed up their 10nm transition. Yeah, their latest 10nm++ node (first - Cannon Lake, second - Ice Lake and now Tiger Lake) allows to boost to 4.8GHz at the expense of insane power consumption and they've made changes to the Willow Cove Core architecture which sometimes translate to a lower performance than Ice Lake:
> 
> 
> 
> ...


The only thing you're trying to sh1t on is our intellect. But it ain't happening... you can keep it by your bedside.

Intel has done this to itself, because it sat on the big throne for years and didn't really do anything to innovate in the consumer market - only trying to get users' money. I guess you liked paying for an i7 and getting an i3 all those years.

The 5600X is not an entry-level CPU. It's a CPU that outperforms all previous Zen CPUs (and most probably Intel's too) in single thread, and matches a lot of mid-range parts - even Intel's, some with more cores - in multi-thread, for $300...
The 5600X is not replacing the 3600 non-X.
Core count does not determine level of products. Performance does.

Thankfully AMD now has offerings that have shifted the entire market, and Intel has woken from its eternal sleep and is trying to put its act/sh1t together, unlike you...

So before you say anything about a +$50 price on top-line products, remember how Intel cut $500 trying to match AMD's.
You would still be paying $1000 for a $500 CPU.

So keep your act and shit together and to yourself...
We don't need that here.


----------



## birdie (Oct 24, 2020)

Zach_01 said:


> Core count does not determine level of products. Performance does.



This is exactly why Ryzen 1000/2000 CPUs got popular in the first place - MOAR cores than Intel at a reasonable price. How fast AMD fans have forgotten everything.

The 5600X is an entry-level CPU, as AMD hasn't yet announced anything cheaper/simpler. And it's not just $50 more expensive, it's $100 more expensive, or 50%. If Intel had done anything like that for their entry-level CPUs, people would have torn them apart! And being marginally faster in lots of workloads than Skylake from 2015 allows them to dictate insane prices? I don't want to participate in this discussion any longer. It's just dirty. One corporation ripping off its customers while offering the highest performance? Bad, bad, bad! An underdog now ripping off their customers by offering the highest performance? That's totally OK. F it.


----------



## Valantar (Oct 24, 2020)

birdie said:


> I know AMD fans are super excited but I'd love to shit on your parade. You know why? AMD has finally outperformed an Intel uArch from ... 2015. This might sound like a great achievement but honestly it's just because Intel has completely f*ed up their 10nm transition. Yeah, their latest 10nm++ node (first - Cannon Lake, second - Ice Lake and now Tiger Lake) allows to boost to 4.8GHz at the expense of insane power consumption and they've made changes to the Willow Cove Core architecture which sometimes translate to a lower performance than Ice Lake:
> 
> 
> 
> ...





birdie said:


> This is exactly why Ryzen 1000/2000 CPUs got popular in the first place - MOAR cores than Intel at a reasonable price. How fast AMD fans have forgotten everything.
> 
> The 5600X is an entry level CPU as AMD hasn't yet announced anything cheaper/simpler. And it's not just $50 more expensive, it's $100 more expensive, or 50%. If Intel had done anything like that for their entry level CPUs people would have torn them apart! And being marginally faster in lots of workloads than Sky Lake from 2015 allows them to dictate insane prices? I don't want to participate in this discussion any longer. It's just dirty. One corporation ripping off it's customers while offering the highest performance? Bad, bad, bad! An underdog now ripping off their customers by offering the highest performance? That's totally OK. F it.


Just because it's the lowest tier product launched as of now does not make it "entry level" - by that logic the RTX 3080 is an entry level GPU ...
Just like previous generations, we can expect a 6c12t 5600 non-X (likely in the $230 range), a 6c6t or 4c8t 5500(X), and so on.

Beyond that, while there is a smidgen of truth to what you're saying, you're twisting it way past what's reasonable. Some issues:
- Zen 2 already surpassed Skylake and its derivatives in terms of IPC, beating them by ~7% in AnandTech's testing. Latency-sensitive workloads like gaming were one area where Intel still had the upper hand, but it was also the only area.
- Intel is already hitting >5GHz on 14nm, so saying their 10nm++ node allows for 4.8GHz is ... a bit weird? You're also skipping over the fact that Ice Lake improved IPC over Skylake by ~18%, which Tiger Lake carries forward (though with much improved clocks thanks to the improved 10nm++ node), both of which Zen 3 should now handily surpass again. So Zen 3 isn't surpassing an architecture from 2015, but one from 2020.
- Intel's issues don't just stem from their messed-up 10nm node, but also from their architectural development plans and their failure to improve upon Skylake for far too long when they knew 10nm wasn't panning out. (And especially that it took them _five years_ to backport a newer core design to 14nm.)

As for saying "One struggling monopoly has been replaced by another" - please come back in five or more years. Taking a leading market position (which AMD still doesn't have, just to be clear), does not make you a monopolist. Far from it. Your arguments here are ridiculously simplistic.

I understand your frustrations regarding the higher prices, but there's also reason behind that. AMD has sold themselves as the value option with Ryzen 1000-3000. Now, with Ryzen 5000, they see no reason to present themselves that way, instead selling themselves as the overall performance champion. Thus they have no reason to price themselves lower than Intel - the value comes with the better performance, not the lower price. Does this suck for end users now accustomed to cheap many-core CPUs from AMD? Of course it does! But it is nowhere near a reasonable threshold for being called a ripoff. You're getting your money's worth, after all. It's not like they are returning to $400 4c8t CPUs like we had before Ryzen...


----------



## birdie (Oct 24, 2020)

Valantar said:


> Just because it's the lowest tier product launched as of now does not make it "entry level" - by that logic the RTX 3080 is an entry level GPU ...



All three vendors - AMD, NVIDIA and Intel - have naming schemes they follow closely. XX80 products from NVIDIA have always been top tier, going back to the GeForce4 Ti 4800. Ryzen XX60 CPUs have always been entry-level: Ryzen 5 1600, Ryzen 5 2600, Ryzen 5 3600 and now Ryzen 5 5600X. Again, if Intel had done anything like that, people would have torn them apart - and they had the performance crown for more than a decade.

The Core i3-6100, much faster in single-threaded mode than anything from AMD at the time, was sold for $117.
The Core i3-4130 before it: $122.

Why didn't Intel sell the Core i3-6100 for $183? It was the fastest entry-level CPU at the time!

Double effing standards and hypocrisy from AMD fans all the effing time, even when their idol starts ripping people off (Ryzen 5 3600: $200; Ryzen 5 5600X with the same number of cores: $300).



Valantar said:


> Now, with Ryzen 5000, they see no reason to present themselves that way, instead selling themselves as the overall performance champion.



Even the most evil company in the world, Intel, didn't allow itself to do that as indicated earlier. F it and I'm out.

Speaking of monopolies. Yes, AMD is playing like a monopoly. They've got the highest performance and they've started dictating prices which indicate they have no competition. Again, refer to my example at the beginning of the post: Intel did not allow itself to increase prices between generations for similar products, except when they started to offer significantly more cores. AMD has increased the price of their entry level CPU by whopping 50%, not $50 you keep mentioning.


----------



## kapone32 (Oct 24, 2020)

cyberloner said:


> hmm fx8350 for 8 years....so attracted by this cpu...


Prepare to have a smile on your face every single time you press the power button.


----------



## Zach_01 (Oct 24, 2020)

birdie said:


> This is exactly why Ryzen 1000/2000 CPUs got popular in the first place - MOAR cores than Intel at a reasonable price. How fast AMD fans have forgotten everything.
> 
> The 5600X is an entry level CPU as AMD hasn't yet announced anything cheaper/simpler. And it's not just $50 more expensive, it's $100 more expensive, or 50%. If Intel had done anything like that for their entry level CPUs people would have torn them apart! And being marginally faster in lots of workloads than Sky Lake from 2015 allows them to dictate insane prices? I don't want to participate in this discussion any longer. It's just dirty. One corporation ripping off it's customers while offering the highest performance? Bad, bad, bad! An underdog now ripping off their customers by offering the highest performance? That's totally OK. F it.





birdie said:


> All three vendors AMD, NVIDIA and Intel have a naming scheme they follow closely. XX80 products from NVIDIA have always been top tier, starting with GeForce4 4800Ti. Ryzen XX60 CPUs have always been entry-level, Ryzen 5 1600, Ryzen 5 2600, Ryzen 5 3600 and now Ryzen 5 5600X. Again, if Intel had done anything like that, people would have torn them apart and they had the performance crown for more than a decade.
> 
> The Core i3 6100, much faster in single-threaded mode than anything from AMD at that time was sold for $117.
> The Core i3 4130 before it, $122.
> ...


Yes, yes - the RTX 3070 announced by NVIDIA is the entry-level GPU that replaces the GTX 1650, with a price bump of 300%.

*Are we in kindergarten here?*

So the 1600/2600/3600/5600 are the entry-level CPUs of AMD... Right...!!!
*And the 1200/1300/1400/1500/3100/3300 what exactly are? Sub-entry level or non existent CPUs?*
You can cry all you want. 5600X is replacing 3600X and has price bump of 50$. That is a +20% on MSRP with at least the same performance uplift and most probably performs faster than any 6core.

Please... you can try harder than this.


----------



## birdie (Oct 24, 2020)

Zach_01 said:


> Yes yes, RTX3070 anounced by nVidia is the entry level GPU that replaces GTX 1650 and has a price bump of 300%.
> 
> *Are we in kindergarden here?*
> 
> ...



I'm ignoring your posts from now on. You've failed to address the fact that Intel doesn't allow itself to raise prices when it releases faster, better products. You're trying to compare the 5600X to the 3600X, which wasn't the entry-level CPU; it was the 3600, which cost $200, so the difference is not $50 but $100, i.e. a whopping 1.5 times. Good luck with AMD a-licking and vindicating their monopolistic behavior (because it is what it is). What's bad for Intel and NVIDIA is totally OK for AMD. I get it; now I have nothing more to talk about with you.


----------



## Valantar (Oct 24, 2020)

birdie said:


> All three vendors AMD, NVIDIA and Intel have a naming scheme they follow closely. XX80 products from NVIDIA have always been top tier, starting with GeForce4 4800Ti. Ryzen XX60 CPUs have always been entry-level, Ryzen 5 1600, Ryzen 5 2600, Ryzen 5 3600 and now Ryzen 5 5600X. Again, if Intel had done anything like that, people would have torn them apart and they had the performance crown for more than a decade.


Uh, what? No, none of those have ever been entry level CPUs. They are, and have always been, mid-range parts. In the Ryzen 1000-series there were the 1200 and 1300X. In the 2000 series there was the 2300X, with 2000-series APUs (2200G and 2400G) filling out the entry level offerings. In the 3000 series you had the same again, with the 3100 and 3300X, alongside the 3200G and 3400G. You seem to be conflating _which parts are interesting to enthusiasts_ with _what their product tier is_. This is simply not true. xx60(X) Ryzen CPUs are very much mid-range parts. (I'd say the xx60 non-X is lower mid-range, xx60X is just plain mid-range, and the xx70 is upper mid-range.)



birdie said:


> The Core i3 6100, much faster in single-threaded mode than anything from AMD at that time was sold for $117.
> The Core i3 4130 before it, $122.


Relevance? One is a Skylake part, launched long before any Ryzen, and the other is Haswell, launched several years before that again.



birdie said:


> Why didn't Intel sell the Core i3 6100 for $183? It was the fastest entry level CPU at that time!


... it wasn't. That was the i3-6300. Besides that, due to the massive increase in core counts over the past three or so years, the range and tiering of CPUs has obviously changed - hence why we no longer have $400 4c8t CPUs, as I said. All your examples here illustrate is just what terrible value the i7-6700K and similar chips were, even in their time.



birdie said:


> Double effing standards and hypocrisy from AMD fans all the effing time even when their idol starts ripping off (Ryzen 5 3600 $200, Ryzen 5 5600 with the same number of cores $300).


What double standard? You're comparing different lineups and saying they are the same. There have been xx60X SKUs in all Ryzen series so far. There will in all likelihood be non-X SKUs in the 5600 series too, but they have not yet been launched. I would be _shocked_ if there wasn't a ~$230 5600 non-X launched in a few months - and if that turned out to be true, I would indeed be rather pissed. You are allowing your apparent bias against AMD to make you _preemptively angry_ long before the full 5000 series lineup is launched; you are _creating a reason for yourself to be mad_. Please stop.



birdie said:


> Even the most evil company in the world, Intel, didn't allow itself to do that as indicated earlier. F it and I'm out.


Allow itself to do what? To increase its prices as it got into a better competitive position? Given that Intel has been dominant in CPUs since at least 2006, I really don't have the data available to comment on that. I don't know about you, but to me, bribing OEMs to use your product instead of a competitor's is ... ever so slightly worse than increasing prices a little. Just a tad, you know?



birdie said:


> Speaking of monopolies. Yes, AMD is playing like a monopoly. They've got the highest performance and they've started dictating prices which indicate they have no competition. Again, refer to my example at the beginning of the post: Intel did not allow itself to increase prices between generations for similar products, except when they started to offer significantly more cores. AMD has increased the price of their entry level CPU by whopping 50%, not $50 you keep mentioning.


First off: acting like a monopolist does not make you one. Secondly, there are many reasons beyond being a monopolist for increasing prices - such as delivering a superior product. Or are you saying BMW and Mercedes are monopolists because they price their cars higher than Toyota and Honda? Intel didn't need to increase prices because they had already created a market situation where they were selling dirt-cheap CPUs for $400 and calling them high-end. AMD tore down that system, and now you're somehow complaining that in late 2020 you can get a $300 6c12t CPU that at 65W TDP/88W max power draw boosts to 4.6GHz with significantly higher IPC than competing solutions? I mean, the 3600 was a _fantastic_ value CPU, but the 5600X is promising to be noticeably faster - it's clocked higher _and_ has much higher IPC, after all. It's still not as great value as the 3600 was, but it's not bad - and again, _there will in all likelihood be a 5600 non-X_.



birdie said:


> I'm ignoring your posts from now on. You've failed to address the fact that Intel doesn't allow itself to raise prices when they release faster better products. You're trying to compare the 5600X to the 3600X which wasn't the entry level CPU, it was the 3600 which cost $200, so the difference is not $50 but $100, i.e. whopping 1.5 times. Good luck with AMD a-licking and vindicating their monopolistic behavior (because it is what is is). What's bad for Intel and NVIDIA, is totally OK for AMD. I get it, now I have nothing else to talk with you about.


Dude, you need to calm down. You have created an entirely arbitrary definition of "entry level" that you are then using to whine about a situation that _isn't real_. Is the 5600X more expensive than the 3600? Yes. Is it also much faster? Yes. Is it in the same product tier? _No_, that would be the (likely ~$230) 5600 or the ($249) 3600X. Is it going to be the entry level Ryzen 5000 chip going forward? Not in any way, shape or form. Launching higher end parts first, and filling out the midrange and lower end later is entirely standard industry practice. AMD does it, Intel does it, Nvidia does it, and there is nothing inherently problematic with this.


----------



## Zach_01 (Oct 24, 2020)

birdie said:


> I'm ignoring your posts from now on. You've failed to address the fact that Intel doesn't allow itself to raise prices when they release faster better products. You're trying to compare the 5600X to the 3600X which wasn't the entry level CPU, it was the 3600 which cost $200, so the difference is not $50 but $100, i.e. whopping 1.5 times. Good luck with AMD a-licking and vindicating their monopolistic behavior (because it is what is is).


Yeah, ignore the fact that the entry level of the 3000 series is the 3100 and 3300... You can close your eyes all you want. They still exist. They're still there.
Are you playing blind?
Need a little help? Are you ignoring other things besides me, too?


Zach_01 said:


> *And the 1200/1300/1400/1500/3100/3300 what exactly are? Sub-entry level or non existent CPUs?*
> You can cry all you want. 5600X is replacing 3600X and has price bump of 50$. That is a +20% on MSRP with at least the same performance uplift and most probably performs faster than any 6core.


You can try all you want, but the 5600X is replacing the 3600X, whether you like it or not, understand it or not. I highly doubt that you don't; you're just trying to distort reality for whatever reason. Couldn't care less why...

EDIT: typo


----------



## birdie (Oct 24, 2020)

Valantar said:


> Relevance? One is a Skylake part, launched long before any Ryzen, and the other is Haswell, launched several years before that again.



No effing relevance. AMD releases a new generation of the same product (More cores? No! Some new features, maybe the AVX-512 instruction set or DL instructions? No! Maybe integrated graphics for a change? No!) at a whopping 1.5 times higher price.



Valantar said:


> I mean, the 3600 was a _fantastic_ value CPU, but the 5600X is promising to be noticeably faster - it's clocked higher _and_ has much higher IPC, after all. It's still not as great value as the 3600 was, but it's not bad - and again, _there will in all likelihood be a 5600 non-X_.



Has Intel ever increased their prices 1.5-fold when their new CPUs offered fantastic increases in IPC and frequency? Should I remind you of Sandy Bridge, which featured both of these great advances vs. Clarkdale before it?



Valantar said:


> What double standard? You're comparing different lineups and saying they are the same. There was no X SKU in the 3600 series. There was in the 1600 and 2600 series. There will in all likelihood be in the 5600 series, but it has as of yet not been launched. I would be _shocked_ if there wasn't a ~$230 5600 non-X launched in a few months - and if that turned out to be true, I would indeed be rather pissed. You are allowing your apparent bias against AMD to make you _preemptively angry_ long before the full 5000 series lineup is launched; you are _creating a reason for yourself to be mad_. Please stop.



AMD has deliberately chosen not to release the Ryzen 5 5600 in order to increase their margins - exactly the thing which Intel and NVIDIA are hated for. I don't care if they release the 5600 at a later time; I'm talking about their initial lineup. You and no one else knows if the 5600 will ever get released. Period. Intel and NVIDIA get ripped apart when they release new generations of products at higher prices. AMD now does exactly the same, and all I see is tons of lame excuses to make it look OK-ish. God, I'm tired of this stupid conversation. F it and I'm done with it.



Valantar said:


> First off: acting like a monopolist does not make you one.



Wow. I'm out of words. I've never seen logic skewed and twisted like this. Another person whose messages I will no longer see. For the 20th time, I don't understand why this behavior is condoned while Intel and NVIDIA are destroyed for it.


----------



## kapone32 (Oct 24, 2020)

If AMD had launched these X SKUs at the prices of the already-existing 3000-series SKUs, they would have had to prematurely drop the prices of those parts. As much as AMD has gained in mind share and revenue, they have only been financially solvent for about a year. And as much as people would like to complain about their tactics being the same as Intel's, if you already own a B450 to X570 board you just need a BIOS update to make it work. So much anger for such a small issue. If you don't want to pay $299 US for the 5600X, just wait until a chip comes out that you want. This is the last iteration of AM4 (as far as we know), so expect that there will be lots of choices for all of us, including the upcoming 4000 series APUs.


----------



## Valantar (Oct 24, 2020)

birdie said:


> No effing relevance, AMD releases a new generation of the same product (More cores? No! Some new features, maybe the AVX512 instruction set, or DL instructions? No! Maybe integrated graphics for a change? No!) at a whopping 1.5 times higher price.
> 
> Has Intel ever increased their prices 1.5 fold when their new CPUs offered fantastic increases in IPC and frequency? Should I remind you of Sandy Bridge which featured both these great advances vs. Clarkdale before it?


Okay, here's some basic math:
Ryzen 3600X: $249
Ryzen 5600X: $299.
299/249 = 1.2008
1.2008 ≠ 1.5

For what seems to be the hundredth time: _the 5600X is not in the same tier as the 3600_.



birdie said:


> AMD has deliberately chosen not to release the Ryzen 5 5600 to increase their margins - exactly the thing which Intel and NVIDIA are hated for. I don't care if they release the 5600 at a later time, I'm talking about their initial lineup. You and no one else know if 5600 will ever get released. Period. Intel and NVIDIA get ripped apart when they release new generations of products at a higher price. AMD now does exactly the same and all I see is tons of lame excuses to make it look OK'ish. God, I'm tired of this stupid conversation. F it and I'm done with it.


Sure, part of why there isn't yet a 5600 is that while we wait for it, some people will get impatient and buy the X instead, increasing AMD's margins a tad and helping them amortize the R&D cost for the new architecture faster with higher ASP parts. For the rest of us, we can wait a bit, not get stressed out over stupid stuff, and get a non-X when it launches.

As for this double standard you're speaking of: I don't believe I've ever criticized either Intel or Nvidia for launching high end parts first and filling out the rest of the lineup later. As I said, it's standard industry practice, and not inherently problematic (partly due to the amortization of R&D costs I mentioned above).

And as for whether a 5600 non-X will be released: as I've gone into at length above, there have _always_ been both X and non-X SKUs in the Ryzen 5 xx60 series. There is no good reason for there not to be this time around, but you are for some reason expecting there not to be. You are the one expecting a change from the norm here, the one making a new assumption, and as such the burden of proof is on you. As for my reaction if, against all odds, no non-X is coming, I've already gone into that too:


Valantar said:


> I would be _shocked_ if there wasn't a ~$230 5600 non-X launched in a few months - and if that turned out to be true, I would indeed be rather pissed.



If you actually _read_ the posts of the people you were arguing against we would be having a much more productive discussion here.


birdie said:


> Wow. I'm out of words. Never seen logic being skewed and twisted like this time around. Another person whose messages I will no longer see. For the 20th, I don't understand why this behavior is condoned while Intel and NVIDIA are destroyed for it.


To be a monopolist, you need to be in a position where you can have a monopoly. Intel still outsells AMD by at least 3-4X. That is hardly a monopoly, right? As for you entering into a discussion, making a bunch of sensationalist claims with no backing, and then lashing out at and ignoring the people arguing against you, well ... that's on you.


----------



## kapone32 (Oct 24, 2020)

Valantar said:


> Okay, here's some basic math:
> Ryzen 3600X: $249
> Ryzen 5600X: $299.
> 299/249 = 1.2008
> ...


----------



## Zach_01 (Oct 24, 2020)

Valantar said:


> As for you entering into a discussion, making a bunch of sensationalist claims with no backing, and then lashing out at and ignoring the people arguing against you, well ... that's on you.


He entered this with the intention of doing more than that, but I highly doubt he can...



birdie said:


> I know AMD fans are super excited but I'd love to shit on your parade.


I guess this is productive... in a way...


----------



## crimsontape (Oct 24, 2020)

I come here for the comments. I wasn't disappointed. You are all awesome.

No one likes price hikes, but I'm sure this isn't all money in the pocket for AMD. I wonder if they bought IP to move their architecture forward. That would explain a hike. It might be the same IP that's finding its way into Zen 2 and the new consoles... Possibly a deferred cost model? Simple supply and demand for their CPUs is another factor. They have the product, and increased market share to show for it now, no?

This isn't monopoly. This is capitalization. And they're doing it very well. They're not fixing the market and choke-holding us into buying their products. Intel has arguably become that, but only thanks to the same approach: capitalize. Which, for them, I believe was mostly a question of volume of supply (which has only been shaky in the last few years), and establishing ISAs which other vendors then pay to implement and use. More IP.

There's only guilt when you fix/manipulate markets - price-fixing (as ATI and NVIDIA once did, some 15 years ago?) or purposefully breaking an existing market solution to artificially encourage the adoption of a bad product for which you're the sole provider (electric cars, Intel MMX).


----------



## Mats (Oct 24, 2020)

birdie said:


> Has Intel ever increased their prices 1.5 fold when their new CPUs offered fantastic increases in IPC and frequency?


Neither has AMD. 36% isn't 50%.


```
3600X    249        5600X    299    20 % higher
3700X    329        5800X    449    36 % 
3800X    399        5800X    449    12.5 %
3900X    499        5900X    549    10 %
3950X    749        5950X    799    6.7 %
```
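For reference, the percentage bumps in the table can be reproduced in a couple of lines (launch MSRPs in USD, as listed above):

```python
# Launch MSRPs (USD) from the table above; each new part vs. the old
# part it is compared with.
pairs = [
    ("3600X", 249, "5600X", 299),
    ("3700X", 329, "5800X", 449),
    ("3800X", 399, "5800X", 449),
    ("3900X", 499, "5900X", 549),
    ("3950X", 749, "5950X", 799),
]
for old, p_old, new, p_new in pairs:
    print(f"{old} ${p_old} -> {new} ${p_new}: +{(p_new / p_old - 1) * 100:.1f}%")
```

Which gives +20.1%, +36.5%, +12.5%, +10.0% and +6.7% respectively - nowhere near the 50% being claimed.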

Yeah, we all miss the successors of the 3600 and the 3700X, but if prices seem a bit high, just wait.
More models will most likely show up, prices will go down, and Rocket Lake will probably affect prices in some way.
Stop comparing the 3700X to the 5800X, as we will most likely see some product in between there at some point.
The successor to the 3800X is the 5800X, obviously.



birdie said:


> Should I remind you of Sandy Bridge which featured both these great advances vs. Clarkdale before it?


You're the one who has to be reminded, before you make yet another bitter post about not having the money for Ryzen 5000. Give it a rest already.

Clarkdale was a low-end dual core.
Sandy Bridge was the successor to Lynnfield; the latter had lower clock speeds but was still a very capable CPU. Both launched at over $300.

Now stop being bitter, we don't care.


----------



## RandallFlagg (Oct 24, 2020)

birdie said:


> I know AMD fans are super excited but I'd love to shit on your parade. You know why? AMD has finally outperformed an Intel uArch from ... 2015. This might sound like a great achievement but honestly it's just because Intel has completely f*ed up their 10nm transition. Yeah, their latest 10nm++ node (first - Cannon Lake, second - Ice Lake and now Tiger Lake) allows to boost to 4.8GHz at the expense of insane power consumption and they've made changes to the Willow Cove Core architecture which sometimes translate to a lower performance than Ice Lake:
> 
> 
> 
> ...




What's funny here is how hard you had to look to cherry-pick a benchmark that showed AMD beating Tiger Lake. Right below the image you posted is the one below, where a 15 W-limited Tiger Lake is nearly twice as fast as the 4750G Pro (using AVX) - and who in their right mind would disable AVX for this type of workload (which, admittedly, most people won't be running)?






And then there are all the benchmarks of real-world things people do with their laptops that you ignored, and where Tiger Lake completely rips apart AMD. 

In fact, Tiger Lake as a laptop chip is as fast as, and quite often faster than, the 10900K in many instances. For example, its Kraken 1.1 scores of 630/631 ms beat the previous best score for *any* system - 730 ms from a 10900K. That is a huge difference.

If any significant part of this gets into Rocket Lake, Zen 3 won't be on top for very long.  

Web Browsing :













Rendering :





Play a MMO :





Play an FPS :


----------



## Mats (Oct 24, 2020)

Tiger Lake is very capable. Overall, it has a faster GPU but a slower CPU compared to Renoir, with the latter being no surprise given it only has 4 cores. 
Even a 6-core TL would possibly come close to Renoir's CPU performance.









Asus ZenBook 14 UX425E Review: 11th Gen Core i7 "Tiger Lake" Debut (www.notebookcheck.net)

Asus will be one of the first major OEMs to launch a family of Tiger Lake laptops this October. We test the upcoming ZenBook 14 UX425EA equipped with the Core i7-1165G7 to see how it compares to the rising popularity of AMD Ryzen alternatives.




Now, I don't know why Intel still can't make an 8-core 10 nm part. Low yields with Cannon Lake and Ice Lake, I get that, but even to this day? Intel's 10 nm is a train wreck that keeps on pushin'. The architecture seems decent tho.


----------



## RandallFlagg (Oct 24, 2020)

Mats said:


> Tiger Lake is very capable. Overall, it has a faster GPU but a slower CPU compared to Renoir, whit the latter being no surprise given it only has 4 cores.
> Even a 6 core TL would possibly come close to Renoirs CPU performance.
> 
> 
> ...



Tiger Lake does not have a slower CPU, it has fewer cores. Look at the benchmarks.  

This is R15 ST, Tiger Lake vs the 4800U - Tiger Lake is 27.1% faster here:







Keep in mind Tiger is an Intel laptop chip; I'm now comparing it to full-on desktop chips.

This is vs the R5 3600 desktop chip - Tiger is 18.5% faster:





This is vs the R7 3800XT - the fastest AMD desktop chip here, Tiger is 8% faster:





And here, the fastest desktop chip you can buy for this ST benchmark, the 10900K, is beaten by a Tiger Lake laptop chip:


----------



## zlobby (Oct 24, 2020)

Dredi said:


> Not really relevant to the news article, but I wonder how long it’ll take for userbenchmark to change its point system (again) after the launch? Will it become just a memory latency test? XD


Hey, as long as Intel wins, it's OK.

We saw a long time ago that money overpowers dignity.


----------



## Mats (Oct 24, 2020)

RandallFlagg said:


> Tiger Lake does not have a slower CPU, it has less cores.  Look at the benchmarks.


Well, that's pretty much what I said. 


> Overall, it has a faster GPU but a slower CPU compared to Renoir, with the latter being no surprise given it only has 4 cores.





> Even a 6 core TL would possibly come close to Renoirs CPU performance.



Please read before you reply.
Also mobile vs desktop doesn't make much difference when talking ST, since power limit becomes less of a limiting factor.






AMD Ryzen 7 PRO 4750G vs. AMD Ryzen 7 4800U - Benchmark, Test and Specs (www.cpu-monkey.com)

AMD Ryzen 7 PRO 4750G vs. AMD Ryzen 7 4800U - Benchmark, Geekbench 5, Cinebench R20, Cinebench R23, Cinebench R15 and FP32 iGPU (GFLOPS) benchmark results


----------



## RandallFlagg (Oct 24, 2020)

Mats said:


> Well, that's pretty much what I said.
> 
> 
> 
> ...



I did read it, but saying the "CPU is slower" is false; fewer cores is not the same thing as having a slower CPU. That depends entirely on what you're doing, and most things are still single-thread limited. Just open up Task Manager and watch as you do things; you'll hit a single-core limit long before you hit a multi-core limit the vast majority of the time. Then there is the spectre of 6- and 8-core Tiger Lake in Q1. And once Rocket Lake brings high core counts, high frequencies, and high power limits, the picture becomes quite a bit clearer.

That 2nd comment about power doesn't make sense.  The 3800XT is 26% faster than the 4800U in single thread - the difference is entirely due to power limits since these are the same architecture chips.  Diminishing returns, sure, but 26% is nothing to sneeze at.  Yet that 105W desktop TDP part still loses to a 28W Tiger Lake part by 8%.  

The per core performance difference between laptop Tiger Lake and the laptop Ryzen Renoir and desktop Ryzen Zen 2 parts is massive.  For that matter the difference between Tiger Lake and Comet Lake is massive as well.


----------



## Mats (Oct 24, 2020)

RandallFlagg said:


> fewer cores is not the same thing as having a slower CPU.


Nobody said so. TL is slower in many benchmarks, mostly because it only has 4 cores.
I did edit my last post so maybe you didn't get the comparison of the 15 W 4800U and the 65 W 4750G, both Renoir. The 15 W part barely runs slower in ST than the 65 W.
Judging by your logic, the 4800U would crush the 4750G if it could run at 65 W, which is not the case.


----------



## TheoneandonlyMrK (Oct 24, 2020)

Should be a new version out shortly to correct this great wrong, userbench clearly will need a new version too soon.


----------



## RandallFlagg (Oct 24, 2020)

Mats said:


> Nobody said so. TL is slower in many benchmarks, mostly because it only has 4 cores.
> I did edit my last post so maybe you didn't get the comparison of the 15 W 4800U and the 65 W 4750G, both Renoir. The 15 W part barely runs slower in ST than the 65 W.
> Judging by your logic, the 4800U would crush the 4750G if it could run at 65 W, which is not the case.



You're making a false comparison and putting words in my mouth.

The 4800U is in fact crushed by the 3800XT, by 26%. I just said that. If desktop variants of the Tiger Lake architecture get a similar differential, they would crush a 10900K and a 3800XT by around 30% in single core.


----------



## Mats (Oct 24, 2020)

RandallFlagg said:


> You're making a false comparison and putting words in my mouth.


You're the one who said that Tiger Lake is held back in ST by being mobile. I said no, and showed you a direct comparison between desktop and mobile Renoir, an indication that the power limit has nothing, or very little, to do with it.

Whatever you say is holding back mobile TL must hold back mobile Renoir as well - that's your logic, and don't feel hurt that I carried it over to a different comparison; no offence was intended.



RandallFlagg said:


> The 4800U is in fact crushed by the 3800XT by 26%.


So why not compare the 4800U with its desktop counterpart? By doing so we see exactly what happens when we go from mobile to desktop, ruling out a few differences.


----------



## Valantar (Oct 24, 2020)

RandallFlagg said:


> That 2nd comment about power doesn't make sense. The 3800XT is 26% faster than the 4800U in single thread - the difference is entirely due to power limits since these are the same architecture chips. Diminishing returns, sure, but 26% is nothing to sneeze at. Yet that 105W desktop TDP part still loses to a 28W Tiger Lake part by 8%.


There are significant differences between how Intel and AMD treat power limits and turbo though, both on desktop and mobile. Mobile Intel parts tend to boost to higher power levels than mobile AMD parts, and IIRC single core boost power for TGL is significantly higher than for Renoir, so that at least partially explains the delta. TGL is still very fast obviously, and faster than Zen 2 in both IPC and boost clocks (including mobile TGL vs desktop Zen2). Zen 3 should surpass it though:
SKL (and derivatives): 100% IPC
Zen 2: ~107% IPC (though that doesn't include latency-sensitive applications like games, number is based on AT's SPEC numbers)
ICL (and TGL, as there shouldn't be any IPC change) : ~118% IPC
Zen 3: ~107% × 1.19 ≈ 127.33% IPC. So even with TGL clocking slightly higher, Zen 3 should have the upper hand; at worst they'll tie.
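That compounding can be written out explicitly; a quick sketch using the same estimates (Skylake normalized to 100%):

```python
# Relative IPC estimates from the post above, Skylake = 1.00.
skl = 1.00
zen2 = skl * 1.07      # ~+7% vs SKL (SPEC-based estimate)
icl_tgl = skl * 1.18   # ICL/TGL ~+18% vs SKL
zen3 = zen2 * 1.19     # AMD's claimed +19% over Zen 2

print(f"Zen 3 vs SKL:     {zen3 * 100:.2f}%")                   # ~127.33%
print(f"Zen 3 vs ICL/TGL: {(zen3 / icl_tgl - 1) * 100:+.1f}%")  # ~+7.9%
```

On these numbers, TGL would need to clock roughly 8% higher than Zen 3 just to reach single-thread parity.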


----------



## Mats (Oct 24, 2020)

About Tiger Lake 1165G7 being fast, I dunno anymore. I guess I didn't read up enough in my own link. Shame on me.


----------



## Makaveli (Oct 24, 2020)

birdie said:


> I know AMD fans are super excited but I'd love to shit on your parade. You know why? AMD has finally outperformed an Intel uArch from ... 2015. This might sound like a great achievement but honestly it's just because Intel has completely f*ed up their 10nm transition. Yeah, their latest 10nm++ node (first - Cannon Lake, second - Ice Lake and now Tiger Lake) allows to boost to 4.8GHz at the expense of insane power consumption and they've made changes to the Willow Cove Core architecture which sometimes translate to a lower performance than Ice Lake:
> 
> 
> 
> ...



lol, you came in here to Intel-fanboy troll and then got destroyed by logic.

Is the Intel defense fund not paying you enough?

I've seen posts like this on numerous forums from fanboys who really can't take the heat AMD is putting out, so they go and try to rain on people's parade to make themselves feel better... Get a life!


----------



## Th3pwn3r (Oct 24, 2020)

Metroid said:


> I dont know what lisa is doing but she is doing it right, new management, new plan, new thinking, new products, new records, new highs, new performance. When lisa got in, amd was so far behind intel and nvidia that only a miracle would turn the tides. Good work lisa.
> 
> I hope msi releases a bios for my b450 before january 2021 so i can purchase that 5600x or the 5900x.




It's more along the lines of what they're not doing. They're not milking ancient tech and expecting consumers to be happy, like Intel has been. In my opinion AMD has advanced mostly because they kept pushing to smaller process nodes while Intel sat on their ass. It was the classic story of the tortoise and the hare: the hare slept way too long and is now in a panic.


----------



## RandallFlagg (Oct 24, 2020)

Th3pwn3r said:


> It's more along the lines of what they're not doing. They're not milking ancient tech and expecting consumers to be happy like Intel has been. In my opinion AMD has advanced MOSTLY because they're pushing further in the NM process all while Intel sat on their ass. It was the classic story of the Tortoise And The Hare. The hare slept way too long and is now in a panic.



AMD doesn’t push ‘nm’, that would be TSMC.  AMD is just reaping the rewards of TSMCs leadership in that area.


----------



## Valantar (Oct 24, 2020)

Mats said:


> About Tiger Lake 1165G7 being fast, I dunno anymore. I guess I didn't read up enough in my own link. Shame on me.
> View attachment 173182


That looks like it's from a NotebookCheck review. Care to link it? Of course absolute performance in mobile depends a lot on the implementation (chassis, cooling, VRM, fan curves, etc.), but at least on paper TGL _should_ boost higher than ICL in the short term and sustain slightly higher base clocks in the long term.



Th3pwn3r said:


> It's more along the lines of what they're not doing. They're not milking ancient tech and expecting consumers to be happy like Intel has been. In my opinion AMD has advanced MOSTLY because they're pushing further in the NM process all while Intel sat on their ass. It was the classic story of the Tortoise And The Hare. The hare slept way too long and is now in a panic.


Don't underestimate the value of the architectural development they've done. Remember, Zen (1) represented a >50% IPC improvement over ... ugh, Excavator? I could never tell all that heavy machinery apart. Never mind, Zen was a _massive_ improvement in both performance/clock and performance/W _even before accounting for the node improvement_. That was just icing on the cake. And they've kept iterating on it rapidly, with yearly improvements (though admittedly Zen+ wasn't much to write home about). All the while Intel has just last year gotten around to actually improving their architecture's performance characteristics. Though it's also well worth mentioning that _before_ things stagnated for Intel, we were seeing 5-10% IPC improvements per generation from them. ICL and TGL are ~18% up from SKL, so that's much better, but it took five years. In the mean time, AMD has delivered +~3%, +~15% and +~19% and on a much tighter schedule.


----------



## TheUn4seen (Oct 24, 2020)

Nice... if true. It's not like AMD ever used untrue claims and "fanboy fantasy" marketing to sell products which turned out to be a huge disappointment, right?
But again, great if true.


----------



## thesmokingman (Oct 24, 2020)

I see AMD has finally done it. They've gotten the deniers all in a defensive frenzy. They didn't freak out when it was the 5950x, but no, a 5600x. Cannot have that now! lmao


----------



## Valantar (Oct 24, 2020)

TheUn4seen said:


> Nice.... if true. It's not like AMD ever used untrue claims and "fanboy fantasy" marketing to sell products which turned out to be a huge disappointment, right?.
> But again, great if true.


Well, this is a Passmark result, not AMD marketing. The result may well be submitted by someone inside AMD with the purpose of having it "leak", but unless they are either LN2 overclocking the chip or outright faking the result, there isn't much they could do to make it look better than it is. Traditional overclocking on Ryzen has so far meant _losing_ ST performance in favor of MT, after all.


----------



## PooPipeBoy (Oct 24, 2020)

thesmokingman said:


> I see AMD has finally done it. They've gotten the deniers all in a defensive frenzy. They didn't freak out when it was the 5950x, but no, a 5600x. Cannot have that now! lmao



AMD released more budget-oriented processors at the Zen 2 release (3600, 3600X, 3700X, 3800X, 3900X), but this time around they've focussed more on releasing the higher-end processors with Zen 3 (5600X, 5800X, 5900X, 5950X).

The reality is that AMD hasn't even released any bang-for-buck processor SKUs yet, so the accusations of 50% price increases and price gouging tactics are highly exaggerated.


----------



## Jism (Oct 25, 2020)

oxrufiioxo said:


> When using the default intel spec for power limits its pretty shitty only slightly faster than a 2700X in MT workloads....... Gaming isn't Affected though. Not sure about 21% though either but most Reviews reviewed the chip with all limits removed.



And isn't having the chip locked to a 65 W TDP spec vs an AMD at a 65 W spec in single core a good sign? At the end of the day it's how the chip performs. And if this is true, you're looking at an overall winner coming (finally) from team AMD.


----------



## TheUn4seen (Oct 25, 2020)

Valantar said:


> Well, this is a Passmark result, not AMD marketing. The result may well be submitted by someone inside AMD with the purpose of having it "leak", but unless they are either LN2 overclocking the chip or outright faking the result, there isn't much they could do to make it look better than it is. Traditional overclocking on Ryzen has so far meant _losing_ ST performance in favor of MT, after all.


The thing is, I wouldn't put faking a result past what this particular company is willing to do in an attempt to generate hype. They were willing to outright lie to their investors and clients not that long ago, so... meh, I'll wait for actual third-party reviews of the actual end-user product before even considering a purchase.


----------



## thesmokingman (Oct 25, 2020)

TheUn4seen said:


> The thing is that I wouldn't put faking the result above what this particular company is willing to do in an attempt to generate hype. They were willing to outright lie to their investors and clients not that long ago, so... meh, I'll wait for the actual third party reviews of the actual end-user product before even considering a purchase.



You might want to look in your own mirror. It's ironic that you spout that diarrhea given that Intel is being sued by their own investors for fraud.


----------



## TheUn4seen (Oct 25, 2020)

thesmokingman said:


> You might want to look in the own mirror. It's ironic that you suggest that diarrhea given that Intel is being sued by their investors for fraud.


To be honest, I have nothing but contempt for all corporations and, by extension, their fanboys, irrespective of the market segment they represent. I buy a product for what it is, after it is available on the market, gets reviewed by several sources and has proven itself. AMD products are routinely overhyped before release with outlandish claims which almost every time turn out to be worth their weight in fecal material. I'm not saying Intel is any better, no corporation is, but AMD is the corporation which often used social media and hype to aggressively spread outright lies about their upcoming products.


----------



## thesmokingman (Oct 25, 2020)

TheUn4seen said:


> To be honest, I have nothing but contempt for all corporations and, by extension, their fanboys, irrespective of the market segment they represent. I buy a product for what it is, after it is available on the market, gets reviewed by several sources and has proven itself. AMD products are routinely overhyped before release with outlandish claims which almost every time turn out to be worth their weight in fecal material. I'm not saying Intel is any better, no corporation is, but AMD is the corporation which often used social media and hype to aggressively spread outright lies about their upcoming products.



It seems to me that you are focused only on AMD when Intel keeps lying to everyone and is actually being sued by their investors right now for exactly that. Under Su, AMD has delivered on these so-called outlandish claims. And yet you don't bat for any team... OK.


----------



## efikkan (Oct 25, 2020)

Jism said:


> And is'nt having the chip locked to a 65W TDP spec vs a AMD at 65W spec single core, a good sign? At the end of the day it's how the chip performs. And if this is true your looking at a overall winner coming (finally) from team AMD.


I don't think we can compare it that way. Intel's TDP means the CPU will throttle down to that power usage after a few seconds (typically 28 or 56 seconds), while AMD's TDP isn't a hard limit at all; it's more of a cooling guide. Real benchmarks will show how well it performs, but it will probably be good.
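The Intel side of that can be sketched as a step function. Note this is only the intuition: the real limiter is an exponentially weighted moving average of package power, and the PL2/tau values below are illustrative assumptions, not datasheet numbers:

```python
# Rough model of Intel's default behavior: draw up to PL2 for ~tau seconds,
# then settle to PL1 (the rated TDP). Illustrative values only; the real
# algorithm uses an exponentially weighted moving average, not a hard step.
def sustained_power_cap(t_seconds, pl1=65.0, pl2=224.0, tau=28.0):
    return pl2 if t_seconds < tau else pl1

for t in (0, 10, 28, 60):
    print(f"t={t:>2}s  cap={sustained_power_cap(t):.0f} W")
```

This is why short benchmarks flatter an "at-spec" Intel chip while long runs settle to TDP.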


----------



## Zach_01 (Oct 25, 2020)

3600 non-X, 65 W TDP:

R15/R20 SMT = 86~88 W PPT (Package Power Tracking) = max power draw limit (sustained if temp/current are within limits)
R15/R20 ST = 50~55 W PPT

AMD's TDP rating is about the required cooling capacity at specific CPU/case/ambient temperatures.
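Those PPT numbers line up with the common AM4 rule of thumb that the stock package power limit is about 1.35 × TDP. A hedged sketch (a heuristic, not a per-SKU guarantee from AMD):

```python
# AM4 rule of thumb: stock PPT (Package Power Tracking limit) ~= 1.35 * TDP.
# A heuristic only; individual SKUs and boards can differ.
def ppt_from_tdp(tdp_watts):
    return round(tdp_watts * 1.35)

print(ppt_from_tdp(65))   # 88, matching the ~88 W PPT observed above
print(ppt_from_tdp(105))  # 142
```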


----------



## Mats (Oct 25, 2020)

Valantar said:


> That looks like it's from a NotebookCheck review. Care to link it? Of course absolute performance in mobile depends a lot on the implementation (chassis, cooling, VRM, fan curves, etc.), but at least on paper TGL _should_ boost higher than ICL in the short term and sustain slightly higher base clocks in the long term.


Yup. https://www.techpowerup.com/forums/...ark-single-thread-results.273696/post-4375982

Yeah, it's all about implementation when it comes to laptops, like when a low-end Ice Lake i5 beats a "faster" TL i7 in R20 MT.




But please be aware that Tiger Lake is not slower than the 4800U, it only has _Fewer Cores™_. 
(Not directed at you Valantar)



TheUn4seen said:


> It's not like AMD ever used untrue claims and "fanboy fantasy" marketing to sell products which turned out to be a huge disappointment, right?.


Maybe, but it doesn't matter here. AMD doesn't own Passmark, and they didn't post this, Videocardzzzzzzz did. It's a screenshot from Passmark.

*Why do people still think that AMD is behind this?* 

Is it because the OP mixed AMD slides with the Passmark screenshot?


----------



## Valantar (Oct 25, 2020)

TheUn4seen said:


> To be honest, I have nothing but contempt for all corporations and, by extension, their fanboys, irrespective of the market segment they represent. I buy a product for what it is, after it is available on the market, gets reviewed by several sources and has proven itself. AMD products are routinely overhyped before release with outlandish claims which almost every time turn out to be worth their weight in fecal material. I'm not saying Intel is any better, no corporation is, but AMD is the corporation which often used social media and hype to aggressively spread outright lies about their upcoming products.


You're definitely right that AMD has a history of really shitty marketing, but I have two notes on that:
- The truly shitty marketing has been limited to their graphics cards, and almost entirely to RTG under Raja Koduri. Not saying it was his doing, but we haven't seen anything like the Vega launch and "Poor Volta" since.
- Lisa Su does seem to be on a "sell products on their own merits" rampage through the various PR departments at AMD. She clearly didn't have much say with RTG under Koduri (his desire for near total independence is widely documented), but my impression is that things have been changing since he left. PR wise the RDNA launch was a pretty straightforward affair, after all.


----------



## RandallFlagg (Oct 26, 2020)

thesmokingman said:


> You might want to look in the own mirror. It's ironic that you suggest that diarrhea given that Intel is being sued by their investors for fraud.



You might want to go look at what AMD said in the past, and how that compared to reality.

But since you're not gonna do that, I'll show you.  AMD doesn't exactly lie on these, but they're not going to show you everything either.

AMD's announcement of 3600X gaming performance - pretty much a tie, right?







Reality (note the 9600K is over 10% faster in their geometric mean). This is where you'll see significantly more stutter and less consistent performance from a 3600X than from a 9600K. Note that at the time, there was only a $13 price difference here:





And even in averages, the 9600K was better. Even my lowly and less expensive 10400 whips the newer and more expensive 3600XT by almost 5% in average FPS - that's reality. And the 10600K? We're talking 7.4% faster average FPS than a 3600XT.


----------



## Zach_01 (Oct 26, 2020)

RandallFlagg said:


> You might want to go look at what AMD said in the past, and how that compared to reality.
> 
> But since you're not gonna do that, I'll show you.  AMD doesn't exactly lie on these, but they're not going to show you everything either.
> 
> ...


Can you help me with my math, please? Where exactly are you seeing the 10% difference? Because I only see 2%... (TPU)
And in the geomean the 3600X is in front...


----------



## Valantar (Oct 26, 2020)

RandallFlagg said:


> You might want to go look at what AMD said in the past, and how that compared to reality.
> 
> But since you're not gonna do that, I'll show you.  AMD doesn't exactly lie on these, but they're not going to show you everything either.
> 
> ...


Uhm, did you miss the fact that the top CPU in that chart from Tom's is OC'd to 5GHz, and that the stock 9600K is essentially tied with, but slightly lower than the 3600X?


----------



## RandallFlagg (Oct 26, 2020)

Zach_01 said:


> Can you help me with my math please? Where exactly are you seeing the 10% difference? Because I only see a 2%... (TPU)
> And in Geomean the 3600X is infront...



It's actually 2.7% - closer to 3% than 2% - in the averages. The 9600K OCs to 5.0 pretty much 100% of the time; most are getting 5.1. That's a 13% difference, actually. For $13. Which would explain why the 3600X never actually commanded that price.


----------



## Zach_01 (Oct 26, 2020)

Ohh oh OC...
Yeah... because this is how any corporation markets a CPU: by telling users that they can push power draw to 200 W levels and gain a hefty 10% in performance.

Good to know!

And the difference is 2.3%...


----------



## RandallFlagg (Oct 26, 2020)

Zach_01 said:


> Ohh oh OC...
> Yeah... because this is how any corporation markets a CPU. By telling users that they can push power draws to 200W levels and gain a hefty performance of 10%.
> 
> Good to know!
> ...




The 3600X is at 99.6 and the 9600K at 102.3:

102.3 / 99.6 × 100% = 102.71%

You are looking at the 3600XT which is running about the same price as the 9700K right now.  Edit: Assuming you can find a 3600XT.
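The calculation above generalizes to a one-liner (the helper name here is purely illustrative):

```python
# How much faster score a is than score b, in percent.
def rel_diff_pct(a, b):
    return (a / b - 1) * 100

print(f"{rel_diff_pct(102.3, 99.6):.2f}%")  # 2.71%
```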


----------



## Zach_01 (Oct 26, 2020)

RandallFlagg said:


> The 3600X is 99.6 and the 9600K is 102.3
> 
> 102.3/99.6 X 100% = 102.71%
> 
> You are looking at the 3600XT which is running about the same price as the 9700K right now.  Edit: Assuming you can find a 3600XT.


You're right I did see the XT.
I stand corrected!

Have your +0.4% of performance along with that +10% of 200W.


----------



## Valantar (Oct 26, 2020)

RandallFlagg said:


> It's actually 2.7% - closer to 3% than 2% - in the averages.  The 9600K OCs to 5.0 pretty much 100% of the time, most are getting 5.1.  That's a 13% difference actually.  For $13.  Which would explain why the 3600X never actually commanded that price.


Yet the _vast_ majority of users have no idea how to overclock, no interest in learning to do so, and run their parts at stock. Heck, most non-enthusiast computer builders don't even know they have to activate XMP manually. What is possible and what is reality for most people are not the same thing.


----------



## RandallFlagg (Oct 26, 2020)

Zach_01 said:


> You're right I did see the XT.
> I stand corrected!
> 
> Have your +0.4% of performance along with that +10% of 200W.



It's 179W on torture test.  Note that is (far) less than a 2700X stock on the same test:


----------



## dir_d (Oct 26, 2020)

How can you talk about the performance of a 3600 then turn around and give power numbers for a 2700x, come on man...


----------



## Zach_01 (Oct 26, 2020)

RandallFlagg said:


> It's 179W on torture test.  Note that is (far) less than a 2700X stock on the same test:
> 
> View attachment 173421


I never said anything about OC; you keep bringing it up.
Here we are talking about the 5000 series CPUs, and you have gone back 2 gens for both Intel and AMD talking about OC... *to prove what exactly?* How superior Intel is?

Well have it...








Raja Koduri to Present at Samsung Foundry Forum amid Intel's Outsourcing Efforts (www.techpowerup.com)

Intel's chief architect and senior vice president of discrete graphics division, Mr. Raja Koduri, is said to be scheduled to present at Samsung Electronics Event day. With a presentation titled "1000X More Compute for AI by 2025", the event is called Samsung Foundry SAFE Forum. It is a global...


----------



## Makaveli (Oct 26, 2020)

dir_d said:


> How can you talk about the performance of a 3600 then turn around and give power numbers for a 2700x, come on man...



Guy is reaching real hard with his posts.

Let me sum it up in one line.

"I prefer intel inside"


----------



## BoboOOZ (Oct 26, 2020)

You guys are being way too patient trying to reason with fanboys. Anyway, if you want a more historical perspective on how Intel lies, here you go:










Rounding a few numbers on a presentation is really on the nice side.


----------



## RandallFlagg (Oct 26, 2020)

Zach_01 said:


> I never said anything about OC, you keep bringing it up.
> Here we are talking about the 5000 series CPUs and you have gone back 2 gens for both Intel and AMD talking about OC... *to proof what exactly?* How superior Intel Is?
> 
> Well have it...
> ...



If you go back and read what I was responding to - never mind, I'll do that for you too.

It was a rebuttal to the veracity of AMD's performance claims. In relation to Zen 3, that chart I posted about Zen 2 vs Gen 9 is pretty close to their current claims. I seriously doubt you are going to see a 5600X unseat, say, a 10600K in gaming in the real world. Match it - maybe. Beat it in any significant way - not likely.


----------



## Deleted member 24505 (Oct 26, 2020)

Funny how, to AMD users, it's only Intel users who are fanboys.

Sick of all this crap on TPU.

If this 5600X is as good as it seems, then I will probably switch to it. I have no loyalty to either Intel/AMD or Nvidia/Radeon. I just buy what is best for what I can afford.


----------



## thesmokingman (Oct 26, 2020)

dir_d said:


> How can you talk about the performance of a 3600 then turn around and give power numbers for a 2700x, come on man...



You have to ignore the resident shill.


----------



## Makaveli (Oct 26, 2020)

tigger said:


> Funny how only intel users are fanboys to amd users.
> 
> Sick of all this crap on TPU.



Nothing funny about it. There are fanboys on both sides, and it's also not a TPU-only issue.

This crap goes on all over the internet.

At the end of the day your allegiance should be to your wallet alone and buy what you like.


----------



## Zach_01 (Oct 26, 2020)

RandallFlagg said:


> If you go back and read what I was responding to - nevermind, i'll do that for you too.
> 
> It was a rebuttal to the veracity of AMDs performance claims.  In relation to Zen 3, that chart I posted about Zen 2 vs Gen 9, is pretty close to their current claims.  I seriously doubt you are going to see a 5600X unseat say a 10600K in gaming in the real world.   Match it - maybe.  Beat it in any significant way, not likely.


This is a different matter than OC capabilities. I understand you have no faith in AMD's claims; that's your right. But bringing up a slide that claims parity in 8 games, and then a TPU chart that proves it wrong by 2.7% across 15 games, is not much of a discovery, nor does it disprove AMD's integrity. If we start on the marketing claims and framings that corporations try to present, we'll be at it until 2022. Intel has had worse claims in recent years. AMD seems to have been getting its act together for the last couple of years, if not more, and in the latest Zen 3 announcement they seemed more professional. They even showed games that they traditionally lose.

I know, and many others know, that synthetics don't tell the whole truth about real-life performance. But tell me (or show me) this... How much faster is the 10600K in gaming (1080p) than a 3600X?


----------



## RandallFlagg (Oct 26, 2020)

Zach_01 said:


> ...
> 
> I know and many know that synthetics do not tell the truth about actual real life performance. But tell me (or show) this... How much faster is the 10600k in gaming (1080p) than a 3600X?



It's in the chart I posted, 7.6% on average FPS at stock speed.   

To note, these differences become even more pronounced with newer, faster GPUs. The 9900K had a 4.8% advantage over the 3900XT in TPU's comparison using the 2080 Ti. 

When TPU did the 3900XT vs 9900K vs 10900K comparison with a 3080, that difference went to a full 10%.  

It's clear Zen 2 was already limiting gaming performance, but Skylake+++ had more breathing room.  It will be interesting to see what happens when they put a Zen 3 up against Gen10 with one of these new GPUs.


----------



## Zach_01 (Oct 26, 2020)

RandallFlagg said:


> It's in the chart I posted, 7.6% on average FPS at stock speed.
> 
> To note, these differences become even more pronounced with new faster GPUs.  The 9900K had a 4.8% advantage over the 3900XT in TPUs comparison using the 2080 Ti.
> 
> ...


+7.6% difference for the 10600K. And if we apply the same 3080 rule (3900XT vs 10900K), where will that difference go? To 15%? Maybe 16%?
And the 5600X, with +20% IPC and another 4.5% clock uplift on top of that, won't make it past that 16%?
I know IPC + MHz isn't 1:1 against gaming % perf, but still, we'll see.
And don't get me started on the multi-threaded perf... or the power draw... let it go.


----------



## RandallFlagg (Oct 26, 2020)

Zach_01 said:


> +7.6% for the 10600K, then. And if we apply the same 3080 scaling (3900XT vs 10900K), where does that difference go? To 15%? Maybe 16%?
> And the 5600X, with +20% IPC and another 4.5% clock uplift on top of that, won't make it past that 16%?
> I know IPC + MHz doesn't translate 1:1 into gaming performance, but still, we'll see.
> And don't get me started on the multi-threaded performance... or the power draw... let it go.



IPC is really almost meaningless; I've not seen any site measure it effectively, as it is very use-case specific. For example, you choose Cinebench to measure IPC. And what if I choose to use SuperPi, where a 4.3 GHz 10400 whacks a 4.5 GHz 3600XT to the tune of 8% better performance at 4% lower clocks? That would mean the 10400 has 10-12% better IPC, now wouldn't it?

Basically, in the end it all comes down to the benchmarks of the things you actually use. Every time someone starts talking nm / IPC / GHz, that's just meaningless drivel in the end.


----------



## Valantar (Oct 26, 2020)

RandallFlagg said:


> It's 179W on torture test.  Note that is (far) less than a 2700X stock on the same test:
> 
> View attachment 173421


Again, you are _very much_ misrepresenting the truth here. Are you even reading the graphs you are posting? 179W is _a lot more_ than the 125W of a stock 2700X in the same test. It is less than the 223W of an _overclocked_ 2700X in the same test, and the stock 9600K at 119W is _slightly_ lower than the 2700X at stock.



RandallFlagg said:


> If you go back and read what I was responding to - nevermind, i'll do that for you too.
> 
> It was a rebuttal to the veracity of AMDs performance claims.  In relation to Zen 3, that chart I posted about Zen 2 vs Gen 9, is pretty close to their current claims.  I seriously doubt you are going to see a 5600X unseat say a 10600K in gaming in the real world.   Match it - maybe.  Beat it in any significant way, not likely.


A) AMD's performance claims this go around are not only about a general IPC uplift, but also specific improvements in latency-sensitive applications like games. With the 3600X they made no such claims, only an overall IPC improvement. In other words, Ryzen 5000 is _specifically_ made to improve on this one point where Intel was still beating them, and it has a major architectural change (the unified L3 cache) as well as several smaller ones done with the express intent of achieving this.
B) With Zen 2 vs. Gen 9, AMD was going from a ~8% IPC deficit to a ~7% IPC advantage. This time they're going from a ~7% IPC advantage to a (107 × 1.19 − 100 =) 27.33% IPC advantage. That is certainly a much bigger change than last time around in relation to Intel's current desktop chips. (Ice Lake and Tiger Lake have ~18% higher IPC than Skylake, but are only available for mobile.)
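For what it's worth, the compounding in point B can be sanity-checked in a couple of lines (assuming Skylake IPC normalized to 100, per the estimates above):

```python
# Normalize Skylake-derived (Comet Lake) IPC to 100 and compound the claims.
skylake = 100.0
zen2 = skylake * 1.07   # ~7% IPC advantage for Zen 2 (SPEC-based estimate)
zen3 = zen2 * 1.19      # AMD's claimed +19% generational uplift for Zen 3

advantage = (zen3 / skylake - 1) * 100
print(f"Zen 3 vs Skylake IPC advantage: {advantage:.2f}%")
```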


----------



## Zach_01 (Oct 26, 2020)

RandallFlagg said:


> And what if I choose to use SuperPi, where a 4.3 GHz 10400 whacks a 4.5 GHz 3600XT to the tune of 8% better performance at 4% lower clocks?


Does it? I really don't know... I'm asking.

And indeed, how can SuperPi determine CPU gaming performance... but then, neither can PassMark.

And I didn't choose the CB test to determine the IPC uplift. AMD claimed that as an *avg* result over *many different workloads*, not just Cinebench... So...


----------



## RandallFlagg (Oct 26, 2020)

Zach_01 said:


> Does it? I really don't know... I'm asking.
> 
> And indeed, how can SuperPi determine CPU gaming performance... but then, neither can PassMark.
> 
> And I didn't choose the CB test to determine the IPC uplift. AMD claimed that as an *avg* result over *many different workloads*, not just Cinebench... So...



I'm pointing out that it is trivial for me to choose a workload that favors one CPU over the other, so how much more trivial is it for AMD to choose a workload - *or a mix of workloads, as you say* - that puts their product in the best light? SuperPi isn't the only case where Intel excels in a way that would refute the IPC claims; it also excels in OCR applications, 2D-to-3D image conversion, and AI benchmarks. *What if I make my workload mix out of those benchmarks?*

Remember, you're the one that brought up a 20% increase in "IPC"; *I simply said that IPC is meaningless*.

So I think the real difference here, with Zen 3, is going to look like the difference between the 4.0 GHz 3100 and the 3300X in the chart in the link below, plus a few percent for the clock increases. You'll just get it across the board on all SKUs. That would come out to more like +6-7%, which would bring them roughly to parity with Comet Lake.


*AMD Ryzen 3 3300X Review - The Magic of One CCX* (www.techpowerup.com)

AMD Ryzen 3 3300X crams all its cores into a single CCX. We tested the CCX impact in our review with impressive results, especially in games, where the new CPU design achieves great numbers that are close enough to more expensive Ryzen 5 and Ryzen 7 models, especially if you consider the cost...


----------



## Zach_01 (Oct 26, 2020)

RandallFlagg said:


> I'm pointing out that it is trivial for me to choose a workload that favors one cpu over the other, so how much more trivial is it for AMD to choose a workload -* or mix of workloads as you say* - that puts their product in the best light. SuperPi isn't the only one where Intel excels in such a way that it would refute IPC claims, it also excels in OCR applications and 2D -> 3D image conversion as well as AI benchmarks. *What say I make my workload mix made of those benchmarks?*
> 
> Remember, you're the one that brought up 20% increase in "IPC", *I simply said that IPC is meaningless*.
> 
> ...


What exactly could the statement "+19% IPC uplift in geomean of 25 workloads" mean, then?

By the way, the example of the 3100 vs the 3300X is irrelevant. The +19% IPC uplift AMD has claimed comes not only from the now-unified (and larger) cache but from many more architectural improvements.
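As an aside, a geomean over per-workload uplifts simply multiplies the speedups and takes the n-th root; a sketch with made-up speedup numbers (illustrative only, not AMD's actual 25 workloads):

```python
from math import prod

# Hypothetical per-workload Zen 3 vs Zen 2 speedups; illustrative only.
speedups = [1.39, 1.05, 1.27, 1.10, 1.22]

geomean = prod(speedups) ** (1 / len(speedups))
print(f"Geomean uplift: {geomean - 1:.1%}")
```

Unlike an arithmetic mean, the geomean isn't dominated by a single outlier workload, which is why it is the standard way to aggregate benchmark ratios.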




I really don't think AMD would risk embarrassing itself with these kinds of slides only to end up on par with the competition while raising prices. Even the rocks would laugh, and they know it. This time I think it's serious, and some people are doing everything they can to downplay it.





Those two CPUs have only a 100 MHz difference. Would they dare to show this if it meant merely matching Intel? How big would the difference be between the 3600X, 5600X, and 10600K? I wonder.


----------



## Valantar (Oct 26, 2020)

RandallFlagg said:


> I'm pointing out that it is trivial for me to choose a workload that favors one cpu over the other, so how much more trivial is it for AMD to choose a workload -* or mix of workloads as you say* - that puts their product in the best light. SuperPi isn't the only one where Intel excels in such a way that it would refute IPC claims, it also excels in OCR applications and 2D -> 3D image conversion as well as AI benchmarks. *What say I make my workload mix made of those benchmarks?*
> 
> Remember, you're the one that brought up 20% increase in "IPC", *I simply said that IPC is meaningless*.
> 
> ...


IPC will always be a simplification, as you can never test all workloads. But there are still industry-standard benchmarks that are widely seen as reliable, such as SPEC CPU. AnandTech uses that for their IPC testing, and found Zen 2 both to match AMD's claims (despite those not being based on SPEC testing) and to beat Skylake++++ overall by ~7%. This clearly doesn't cover gaming performance, but it doesn't cover SuperPi either, given that SuperPi isn't actually a useful workload. As such, there is little reason to expect AMD's numbers for Zen 3 not to be representative, though there will as always be outlier tests.

As for gaming performance, they are claiming much, much more than a 19% improvement: their wording is that the 5900X, compared to the 3900X, is "from 5% to 50% improved", with the overall improvement in their testing at 26%. You're presenting it as if they were saying "IPC is better, so games are better", but that is very explicitly not what they are saying; the claims about gaming performance are separate from the IPC claims, and are about specific SKUs being compared (in other words, including clock speed increases). Again, there will obviously be examples where Intel still wins. But there's little reason not to expect AMD to be on the money when they claim to beat the 10900K overall in gaming.


----------



## RandallFlagg (Oct 27, 2020)

Here are the full results from that PassMark run, compared to the 5950X and the 10900 (non-K).

Some of these scores don't quite live up to the hype. Keep in mind the i9-10900 here is limited in the graphics tests due to its 2070 Super vs. the 3080s in the two Ryzen boxes.

If these scores are real, the 5600X's performance is somewhat of a mixed bag.

What's funny here is that the one number everyone posted is CPU Single Threaded (MOps/sec), ignoring everything else.



| | AMD Ryzen 5 5600X | AMD Ryzen 9 5950X | Intel Core i9-10900 |
|---|---|---|---|
| GPU | RTX 3080 | RTX 3080 | RTX 2070 Super |
| Memory | CMW16GX4M2C3200C16 | F4-3600C16-8GTRG | F4-3466C16-8GTZR |
| Drive | CT500P1SSD8 | T-Force TM8FP500 | Samsung 970 EVO Plus 1TB |
| Integer Math (MOps/sec) | 77936 | 202089 | 84927 |
| Floating Point Math (MOps/sec) | 40368 | 106087 | 53041 |
| Prime Numbers (million primes/sec) | 136 | 226 | 82 |
| Extended Instructions, SSE (million matrices/sec) | 17362 | 38147 | 25627 |
| Compression (KB/sec) | 266462 | 616165 | 376500 |
| Encryption (MB/sec) | 16740 | 40316 | 9003 |
| Physics (frames/sec) | 1188 | 1366 | 1302 |
| Sorting (thousand strings/sec) | 27851 | 67871 | 47355 |
| CPU Single Threaded (MOps/sec) | 3495 | 3693 | 3336 |
| Cross-platform Mark (composite average) | 44089 | 102760 | 45984 |
| Simple Vectors (thousand vectors/sec) | 17 | 15 | 17 |
| Fonts and Text (ops/sec) | 281 | 290 | 328 |
| Windows Interface (ops/sec) | 58 | 68 | 84 |
| Image Filters (filters/sec) | 2946 | 2653 | 2673 |
| Image Rendering (thousand images/sec) | 375 | 318 | 401 |
| Direct 2D (frames/sec) | 75 | 79 | 56 |
| PDF Rendering (ops/sec) | 57 | 87 | 70 |
| Direct 2D - SVG (frames/sec) | 100 | 106 | 75 |
| DirectX 9 (frames/sec) | 275 | 143 | 264 |
| DirectX 10 (frames/sec) | 373 | 602 | 93 |
| DirectX 11 (frames/sec) | 537 | 141 | 435 |
| DirectX 12 (frames/sec) | 71 | 90 | 61 |
| GPU Compute (ops/sec) | 16466 | 15696 | 9106 |
| Database Operations (Kops/sec) | 4545 | 8199 | 8444 |
| Memory Read Cached (MB/sec) | 35022 | 37384 | 36442 |
| Memory Read Uncached (MB/sec) | 20791 | 25431 | 20105 |
| Memory Write (MB/sec) | 12067 | 17981 | 18359 |
| Available RAM (MB) | 13869 | 10317 | 27029 |
| Memory Latency (ns, lower is better) | 56 | 41 | 24 |
| Memory Threaded (MB/sec) | 30379 | 53945 | 43982 |
| Disk Sequential Read (MB/sec) | 1817 | 434 | 2890 |
| Disk Sequential Write (MB/sec) | 909 | 400 | 2676 |
| IOPS 32KQD20 (MB/sec) | 370 | 299 | 1393 |
| IOPS 4KQD1 (MB/sec) | 85 | 34 | 76 |
| CPU Mark (composite average) | 22824 | 45564 | 24682 |
| 2D Graphics Mark (composite average) | 1123 | 1182 | 1141 |
| Memory Mark (composite average) | 2692 | 3438 | 4271 |
| Disk Mark (composite average) | 11305 | 4147 | 24995 |
| 3D Graphics Mark (composite average) | 25258 | 10850 | 20444 |
| PassMark Rating (composite average) | 7454 | 7483 | 9018 |
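Taking just the single-threaded scores as posted (5600X = 3495, 5950X = 3693, 10900 = 3336), the relative standings work out like this; a quick sketch:

```python
# Relative single-thread PassMark scores, normalized to the 5600X.
scores = {"Ryzen 5 5600X": 3495, "Ryzen 9 5950X": 3693, "Core i9-10900": 3336}
base = scores["Ryzen 5 5600X"]
for cpu, score in scores.items():
    print(f"{cpu}: {score / base:.1%}")
```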


----------



## Makaveli (Oct 27, 2020)

RandallFlagg said:


> Here are the full results from that PassMark run, compared to the 5950X and the 10900 (non-K).
> 
> Some of these scores don't quite live up to the hype. Keep in mind the i9-10900 here is limited in the graphics tests due to its 2070 Super vs. the 3080s in the two Ryzen boxes.
> 
> ...





You're still going on with this hard reaching? You are not going to stop anyone from going Zen 3 with these posts, nor are you some kind of guru of hidden truth. Give it up!

The 5950X is supposed to be faster than the 5600X in single thread because of clock speed: it boosts to 4.9 GHz versus the 5600X's 4.6 GHz.


----------



## Valantar (Oct 30, 2020)

Makaveli said:


> You're still going on with this hard reaching? You are not going to stop anyone from going Zen 3 with these posts, nor are you some kind of guru of hidden truth. Give it up!
> 
> The 5950X is supposed to be faster than the 5600X in single thread because of clock speed: it boosts to 4.9 GHz versus the 5600X's 4.6 GHz.


Not to mention that in a lot of those tests a $300 6-core AMD CPU is beating a $500 10-core Intel CPU ... There are obviously use cases where the 10900K is faster still, but the more time passes, the fewer there seem to be.


----------

