# Intel Core i7-10700K Features 5.30 GHz Turbo Boost



## btarunr (Feb 12, 2020)

Intel's 10th generation Core "Comet Lake-S" desktop processor series inches closer to its probable April 2020 launch. Along the way we get this fascinating leak of the company's Core i7-10700K desktop processor, which could become a go-to chip for gamers if its specifications and pricing hold up. Thai PC enthusiast TUM_APISAK revealed what could be a Futuremark SystemInfo screenshot of the i7-10700K, which confirms its clock speeds: 3.80 GHz nominal, with an impressive 5.30 GHz Turbo Boost. Intel is probably tapping into the series' increased maximum TDP of 125 W to clock these chips high across the board. 

The Core i7-10700K features 8 cores, and Hyper-Threading enables 16 threads. It also features 16 MB of shared L3 cache. In essence, this chip has the same muscle as the company's current mainstream desktop flagship, the i9-9900K, but demoted to the Core i7 brand extension. This could give it a sub-$400 price, letting it compete with the likes of AMD's Ryzen 7 3800X and possibly even triggering a price cut on the 3900X. The i7-10700K in APISAK's screenshot is shown running on an ECS Z490H6-A2 motherboard, marking the company's return to premium Intel chipsets. ECS currently lacks Z390- or Z370-based motherboards in its lineup, and caps out at B360. 





*View at TechPowerUp Main Site*


----------



## Super XP (Feb 12, 2020)

This reminds me of the Pentium 4 days, when Intel kept pushing higher clock speeds while AMD innovated on CPU design; AMD's CPUs would beat P4s despite clocking up to 1000 MHz lower. That's how efficient and well designed the Athlon 64 and its successors were.


----------



## ARF (Feb 12, 2020)

This chip will be DOA and not needed on the market unless it costs around $250-$300.



Super XP said:


> This reminds me of the Pentium 4 days, when Intel kept pushing higher clock speeds while AMD innovated on CPU design; AMD's CPUs would beat P4s despite clocking up to 1000 MHz lower. That's how efficient and well designed the Athlon 64 and its successors were.



That was because the Athlons had tremendously higher IPC, while the Pentium 4 was designed for high clocks with a very long execution pipeline.
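A toy calculation shows how the IPC argument works. The numbers here are purely illustrative, not measured Athlon 64 or Pentium 4 figures; single-thread throughput is modeled as IPC times clock:

```python
# Toy model: throughput ~ IPC x clock. The IPC values are hypothetical,
# chosen only to illustrate the trade-off discussed above.
def relative_perf(ipc: float, clock_ghz: float) -> float:
    """Relative single-thread throughput under the simple IPC x clock model."""
    return ipc * clock_ghz

athlon = relative_perf(ipc=1.5, clock_ghz=2.0)    # higher IPC, lower clock
pentium4 = relative_perf(ipc=1.0, clock_ghz=3.0)  # lower IPC, 1000 MHz higher clock
print(athlon >= pentium4)  # the lower-clocked, higher-IPC chip keeps up
```

With these made-up numbers, a 50% IPC advantage exactly offsets a 1000 MHz clock deficit, which is the shape of the argument being made here.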


----------



## R0H1T (Feb 12, 2020)

btarunr said:


> This could give it a sub-$400 price


I doubt that; I'm guessing Intel will go with a $600-700 flagship this round. The rebranded 9900K will still retail at or around $500 *IMO*.


----------



## Super XP (Feb 12, 2020)

ARF said:


> This chip will be DOA and not needed on the market unless it costs around $250-$300.
> 
> That was because the Athlons had tremendously higher IPC, while the Pentium 4 was designed for high clocks with a very long execution pipeline.


Yes, I remember that. The IMC helped AMD achieve higher IPC on top of everything else. Those were great times back in the day.


----------



## VrOtk (Feb 12, 2020)

You should look not at the "boost" clock but at the stock one: it's the same as or 100 MHz above the 9900K's (I don't remember which), so all-core OC headroom is going to be the same. After that it all comes down to pricing.


----------



## londiste (Feb 12, 2020)

Super XP said:


> This reminds me of the Pentium 4 days, when Intel kept pushing higher clock speeds while AMD innovated on CPU design; AMD's CPUs would beat P4s despite clocking up to 1000 MHz lower. That's how efficient and well designed the Athlon 64 and its successors were.


Ironically, Intel is taking a page out of AMD's Zen 2 playbook here: high single-core clocks at the expense of high voltage and power consumption.


----------



## Super XP (Feb 12, 2020)

londiste said:


> Ironically, Intel is taking a page out of AMD's Zen 2 playbook here: high single-core clocks at the expense of high voltage and power consumption.


ZEN3 will rewrite that book. Lol
AMD is being very modest and careful with what information they reveal on ZEN3 and what they allow to be leaked, to gauge people's reactions. 
I also noticed Dr. Lisa Su being somewhat cryptic when speaking about ZEN3. Something huge is coming and they are being quite silent about it.


----------



## GlacierNine (Feb 12, 2020)

Nobody going to talk about that 0nm manufacturing process? I'll be really impressed if Intel can pull that off.


----------



## Noztra (Feb 12, 2020)

125W TDP at 3.80 GHz.

300W+ TDP at 5.3 GHz.

Several motherboard vendors have already said they are having issues, because of the 300W+ TDP.


----------



## ARF (Feb 12, 2020)

Noztra said:


> 125W TDP at 3.80 GHz.
> 
> 300W+ TDP at 5.3 GHz.
> 
> Several motherboard vendors have already said they are having issues, because of the 300W+ TDP.



I think it will be an epic fail and the only way to escape from that punishment is not to buy it. Ever.

Especially when you can buy the 65-watt Ryzen 7 3700X for $310.



R0H1T said:


> I doubt that, I'm guessing Intel will go with a $600~700 flagship this round. The rebranded 9900k will still retail at or around $500 *IMO*.



If this is true, the 10-core i9-10900K flagship will cost $800-$900.


----------



## notb (Feb 12, 2020)

Super XP said:


> I also noticed Dr. Lisa Su being somewhat cryptic when speaking about ZEN3. Something huge is coming and they are being quite silent about it.


Or nothing significant is coming and they are silent?

AMD has usually been very vocal about incoming products - even when (or maybe: especially when) they offered a significant leap in performance.


Noztra said:


> Several motherboard vendors have already said they are having issues, because of the 300W+ TDP.


That's extremely unlikely considering motherboard makers have CPU power solutions that can provide 300W+ in Intel HEDT motherboards. They don't have to design anything new.
The only real problem could be in mITX.


Super XP said:


> This reminds me of the Pentium 4 days, when Intel kept pushing higher clock speeds while AMD innovated on CPU design; AMD's CPUs would beat P4s despite clocking up to 1000 MHz lower. That's how efficient and well designed the Athlon 64 and its successors were.


The fact that Intel had this power headroom helped them survive the period when the competition led in technology (which is inevitable from time to time). Intel pushed more cash into R&D, and we all know what happened a few years later.
Will history repeat itself this time? Maybe.
I seriously doubt AMD will make a similar mistake and once again fall to ~10% market share.
But on the other hand: the CPU market has changed, and AMD today is nowhere near its excellent position from the mid-2000s.



ARF said:


> If this is true, the 10-core i9-10900K flagship will cost $800-$900.


It will cost as much as Intel can ask.
I'm not sure why you mock the high price of current Intel flagships. You should admire them instead.


----------



## laszlo (Feb 12, 2020)




----------



## ARF (Feb 12, 2020)

notb said:


> It will cost as much as Intel can ask.
> I'm not sure why you mock the high price of current Intel flagships. You should admire them instead.



Don't you want a cheaper Ryzen 9 3950X and Ryzen 9 3900X? Those are currently trading for as much as $750 and $470.


----------



## HwGeek (Feb 12, 2020)

Let me guess... this will be the fastest gaming CPU at 1080p with an RTX 3080 Ti.
Poor motherboard vendors: how are they going to sell those expensive boards with expensive VRMs to the DIY market while AMD dominates it?
Aren't they already stuck with huge X299 stock? And now this?


----------



## Super XP (Feb 12, 2020)

notb said:


> Or nothing significant is coming and they are silent?
> 
> AMD has usually been very vocal about incoming products - even when (or maybe: especially when) they offered a significant leap in performance.
> 
> ...


Intel is where AMD was when they launched Bulldozer in 2011. The difference is Intel has more market share and a partly positive, yet false, perception. 

AMD took a calculated, innovative risk and bet on the wrong horse. They had their ass handed to them by Intel. 

Today it's Intel's turn to have their ass handed to them by AMD's superior CPUs. Basically Intel deserves this, as their arrogance made them complacent. Intel was so desperate after ZEN launched that they hired Jim Keller in 2018. He's one of the best CPU architects of our time. It took AMD almost 6 years to design and release ZEN. It's probably going to take Intel about the same time.  

In the meantime, here's hoping AMD strips Intel's market share to pieces, as AMD deserves it more.


----------



## Octopuss (Feb 12, 2020)

I already have central heating, thanks a lot.
It would take a lot more than this to impress me.


----------



## londiste (Feb 12, 2020)

Noztra said:


> 125W TDP at 3.80 GHz.
> 300W+ TDP at 5.3 GHz.
> Several motherboard vendors have already said they are having issues, because of the 300W+ TDP.


Boost Clock is effectively single-core.


----------



## Noztra (Feb 12, 2020)

notb said:


> That's extremely unlikely considering motherboard makers have CPU power solutions that can provide 300W+ in Intel HEDT motherboards. They don't have to design anything new.
> The only real problem could be in mITX.



No s*it, Sherlock. But the 10900K is not an HEDT part, is it? It's gonna run on normal motherboards. And apparently it's *extremely likely*, since *several* motherboard manufacturers have claimed they are having issues.









Source: *Intel's 10-core Comet Lake-S CPUs could draw up to 300W* (videocardz.com): "Intel has not revealed any details on Comet Lake-S at CES 2020. Motherboard makers are ready, but Intel is not. According to ComputerBase, who interviewed motherboard makers at CES 2020 and also took part in Intel's conference, Intel is facing issues with its 10-core 14nm Comet Lake-S..."




But ofc you know better than everyone else, because you really like Intel.


----------



## notb (Feb 12, 2020)

ppn said:


> 300+ watts on socket 1200. It will burn the contacts over time. Yes, it may work for a while.


Considering the 9900K is known to pull around 250 W through LGA1151v2 (146 power pins), there's no reason why LGA1200 couldn't manage 300 W.
That's 20% more power, i.e. only 30 of the 49 additional pins would have to be power-delivery (VCC-type) pins to keep the per-pin current from rising.
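A quick sanity check of that arithmetic, using only the numbers quoted in this thread (~250 W through 146 power pins on LGA1151v2, ~300 W rumoured for the 10-core, 49 extra pins on LGA1200):

```python
from math import ceil

# Back-of-the-envelope check of the pin argument. All figures are the
# ones cited in the thread, not official Intel socket specifications.
old_power_w = 250      # approximate 9900K draw through LGA1151v2
old_power_pins = 146   # power pins claimed for LGA1151v2
new_power_w = 300      # rumoured 10-core Comet Lake-S draw
extra_pins_total = 49  # pin-count difference: 1200 - 1151

watts_per_pin = old_power_w / old_power_pins  # load per power pin today

# Power pins needed at 300 W to keep the same per-pin load:
pins_needed = ceil(new_power_w / watts_per_pin)
extra_power_pins = pins_needed - old_power_pins

print(f"{watts_per_pin:.2f} W per pin; need {extra_power_pins} extra power pins")
# -> 1.71 W per pin; need 30 extra power pins
```

So 30 of the 49 added pins assigned to power delivery would hold per-pin current constant, which is the claim being made above.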


HwGeek said:


> Aren't they already stuck with huge X299 stock?, and now this?


X299 was never made in high volumes, so there's certainly no huge stock.


ARF said:


> Don't you want cheaper Ryzen 9 3950X and Ryzen 9 3900X those are currently traded for as much as $750 and $470?


They are sold for as much as AMD and the middlemen can get. That's how business works.
Yes, Intel sells their 8-cores for $400-500, while AMD's competing products are $300-350 (and $750 for 16 cores). This is certainly not a reason to mock Intel.

Would I want PC components to be cheaper? Of course. Just like any other product: food, shoes, cars.
Would I want PC component makers to be as profitable as AMD is right now? Definitely not.


Noztra said:


> No s*it, Sherlock. But the 10900K is not an HEDT part, is it? It's gonna run on normal motherboards. And apparently it's *extremely likely*, since *several* motherboard manufacturers have claimed they are having issues.


So I said that the only thing motherboard makers have to do is use the power delivery setups they already designed for HEDT (dual 8-pin, more robust VRM).
And I asked for some sources to the "*several* motherboard manufacturers have claimed that they are having issues" theory.
The article you've provided is titled: "Motherboard makers are ready, but Intel is not".


----------



## yeeeeman (Feb 12, 2020)

Super XP said:


> This reminds me of the Pentium 4 days where Intel kept pushing higher clock speeds while AMD was innovating on CPU designs where AMD CPUs would beat P4's with up to 1000MHz lower clocks. That's how efficient and well designed the Athlon 64 was and beyond.


AMD made the same mistake a couple of years later with Bulldozer. My, how people forget.
Intel is not pushing frequency now by choice. They just can't fab any new design big or cost-effective enough yet, so they squeeze what they can from 14nm.


----------



## Noztra (Feb 12, 2020)

notb said:


> So I said that the only thing motherboard makers have to do is use the power delivery setups they already designed for HEDT (dual 8-pin, more robust VRM).



So motherboard makers have to redesign their entire stack of motherboards so Intel's 10xxx series won't melt their boards. And what if people don't wanna buy a new motherboard and plug a 10xxx into the "old" one, which doesn't have dual 8-pin power and a more robust VRM?

And using dual 8-pin, a more robust VRM, etc. = increased cost.


----------



## vMax65 (Feb 12, 2020)

As a gaming CPU it should be absolutely fine, especially if it is well under $400, which I am sure it will be due to well-deserved competition from AMD, and that is a great thing for us, the consumers. For the mainstream gaming and general-use community (not those running semi-pro or pro workloads) this could be a very good CPU; otherwise go AMD. 8 cores and 16 threads is way more than enough for any game, and if it boosts to 5 GHz and above at a good price we would have good choices in the CPU space. As a gamer, TDP has never been a factor for me, especially when overclocking, which is what these CPUs pretty much demand. Not sure I have ever understood this AMD versus Intel versus Nvidia crud... Just buy the product that suits you and your budget. I have had AMD, Intel and Nvidia systems and parts through my long PC life, and I will only ever buy what suits my budget regardless of brand.


----------



## Object55 (Feb 12, 2020)

5.3 GHz minus all the security patches is more like 4.1.


----------



## ARF (Feb 12, 2020)

notb said:


> They are sold for as much as AMD and the middlemen can get. That's how business works.
> Yes, Intel sells their 8-cores for $400-500, while AMD's competing products are $300-350 (and $750 for 16 cores). This is certainly not a reason to mock Intel.
> 
> Would I want PC components to be cheaper? Of course. Just like any other product: food, shoes, cars.
> Would I want PC component makers to be as profitable as AMD is right now? Definitely not.



AMD is profitable enough; its stock is among the best performing on the market, if not the best. Currently trading at $54.50 and rising in pre-market.
Let's not be greedy.
Let's not forget that consumers have no choice, because the market is actually a duopoly.


----------



## kapone32 (Feb 12, 2020)

vMax65 said:


> As a gaming CPU it should be absolutely fine especially if it is well under $400 which I am sure it will be due to well deserved competition from AMD and that is a great thing for us the consumer. For the mainstream gaming and general use community and not those running semi or pro workloads this could be a very good CPU otherwise go AMD. 8 cores and 16 threads is way more than enough for any game and if it boosts to 5 GHz and above at a good price we would have good choices in the CPU space. As a gamer, TDP has never been a factor especially when overclocking which is what these CPU's pretty much demand. Not sure I have ever understood this AMD versus Intel versus Nvidia crud...Just buy the product you want that suits you and your budget...I have had AMD, Intel and Nvidia systems and parts through my long PC life and I will only ever buy what suits my budget regardless of brand..



If this CPU sells for $400 I will be pleasantly surprised. It is more likely that it will be north of $500. Though TDP might not bother you, I am pretty sure you will need a beefy quality air cooler, or at least a 240 mm AIO, to keep a beast like this cool. It's like what Super XP said: this is the 9590 equivalent vs the 8350. It had better launch soon, though, because once Ryzen 3 or 4 or whatever it is launches, this CPU may become moot vs the replacement for the 3700 for gaming, as AMD is only about 7-10% away from Intel in pure gaming (and it is not black and white either).


----------



## Deleted member 158293 (Feb 12, 2020)

On the one hand it would give my old EVGA 1300w G2 PSU a good run again and heat a room or 2 in the process.

Just that it isn't worth it anymore...


----------



## vMax65 (Feb 12, 2020)

kapone32 said:


> If this CPU sells for $400 I will be pleasantly surprised. It is more likely that it will be north of $500. Though TDP might not bother you, I am pretty sure you will need a beefy quality air cooler, or at least a 240 mm AIO, to keep a beast like this cool. It's like what Super XP said: this is the 9590 equivalent vs the 8350. It had better launch soon, though, because once Ryzen 3 or 4 or whatever it is launches, this CPU may become moot vs the replacement for the 3700 for gaming, as AMD is only about 7-10% away from Intel in pure gaming (and it is not black and white either).



More than understandable, but with the competition, if it is not well under $400 and more like $300 it will be dead in the water. Intel has to compete, and they have the money in the bank to do it; they could drop their prices in a second to undercut AMD... but of course they won't go all out (as profit matters!)... still, they will have to drop prices significantly. As to the difference in performance, even if it is only small, it just depends on price...

As to TDP: I have always overclocked my CPUs since the good old days, and I have used AIOs, with a 360 mm AIO currently, which tames just about any mainstream CPU.

Bottom line, it is a great time for PCs, and with AMD bringing real competition into the CPU space, we have never had it so good... Long may it continue...


----------



## kapone32 (Feb 12, 2020)

vMax65 said:


> More than understandable, but with the competition, if it is not well under $400 and more like $300 it will be dead in the water. Intel has to compete, and they have the money in the bank to do it; they could drop their prices in a second to undercut AMD... but of course they won't go all out... still, they will have to drop prices significantly. As to the difference in performance, even if it is only small, it just depends on price...
> 
> As to TDP and I always overclock my CPU's since the good old days, and I have used AIO's with a 360mm AIO currently which tames just about any mainstream CPU.



Well, they will still have their years of propaganda and questionable marketing practices to rely on for now. Even north of $500 there will be plenty of people who buy this, so I don't see it being cheaper than the 9900K. The real danger for Intel is in the laptop space, as those new AMD mobile chips seem to be very disruptive. 

AIOs (I know there will be comments on their viability) are a no-brainer for CPUs, and for me the bigger they are, the better they dissipate heat. I personally have a couple of 420 mm rads to cool my setup. The only thing I was establishing is that you will need something powerful to cool this chip and get the advertised boost clocks.


----------



## QUANTUMPHYSICS (Feb 12, 2020)

My next computer will be a 10000 CPU or a Core i11 Extreme


----------



## Elysium (Feb 12, 2020)

Looks like just shy of 5.3 GHz, so perhaps more in line with the initial SKU leak several months ago, although the stock clock back then was reported as 3.6 GHz rather than 3.8. The RRP appeared to come in at $389, far less than $500.


----------



## notb (Feb 12, 2020)

Noztra said:


> So motherboard makers have to redesign their entire stack of motherboards so Intel's 10xxx series won't melt their boards.


Redesign what? They're making new motherboards using the new socket.


> And what if people don't wanna buy a new motherboard and plug 10xxx in the "old" one, which doesn't have dual 8-pin, more robust VRM?


Sockets aren't compatible.


> And using dual 8-pin, a more robust VRM, etc. = increased cost.


Yes. Enthusiast/workstation mobos (Z490, W480) will be expensive and cheaper models may limit 10-core Comet Lake's boost.



ARF said:


> AMD is profitable enough; its stock is among the best performing on the market, if not the best. Currently trading at $54.50 and rising in pre-market.


AMD's share price has almost nothing to do with current or even forecasted earnings. It's based on very optimistic future projections. That's the problem.


----------



## vMax65 (Feb 12, 2020)

kapone32 said:


> Well they will still have their years of propaganda and questionable marketing practices to rely on for now. Even north of $500 there will be plenty of people that buy this so I don't see this being cheaper than the 9900K. Where the potential danger for Intel is in the laptop space as those new AMD mobile chips seem to be very disruptive.
> 
> AIOs (I know there will be comments to the viability) are a no brainer for CPUs and for me the bigger they are the better they are at dissipating heat. I personally have a couple of 420MM rads to cool my setup. The only thing I was establishing is you will need something powerful to cool this and get the advertised boost clocks.



Agreed, and propaganda/marketing is done by everyone across every industry/sector; those with the most money will do whatever they can to get the edge... ultimately, research before buying.
As to the mobile sector, I think Intel is working extremely hard in this space with Foveros, Xe and their new 10nm packages, so this is not a space they will want to lose.
Couldn't agree more on AIOs, the bigger the better... no space for a 420 mm AIO for me, but I will get there...


----------



## DeathtoGnomes (Feb 12, 2020)

Noztra said:


> 125W TDP at 3.80 GHz.
> 
> 300W+ TDP at 5.3 GHz.
> 
> *Several motherboard vendors have already said they are having issues, because of the 300W+ TDP.*



source?

---

I highly doubt Intel will release a new chip that outperforms its predecessor at a lower price. But depending on demand (and supply), on-sale prices could be very close.


----------



## bug (Feb 12, 2020)

Super XP said:


> Yes I remember that. That IMC helped AMD achieve higher IPC on top of everything else. Those were great times back in the day.


It wasn't the IMC; the Athlon XP didn't have one. It was the shorter pipeline.


----------



## dirtyferret (Feb 12, 2020)

Super XP said:


> Intel is where AMD is when they launched Bulldozer in 2011.



  hyperbole?


----------



## cucker tarlson (Feb 12, 2020)

londiste said:


> Ironically, Intel is taking a page out of AMD's Zen 2 playbook here: high single-core clocks at the expense of high voltage and power consumption.


I'm not touching anything that uses over 1.4 V, let alone 1.55 V+ like Ryzen 3000.
If 10th gen needs super-high voltage for its boost clocks, I'm going 9700K, and I don't mind the thread count.


----------



## nickbaldwin86 (Feb 12, 2020)

I run a i3-8350K @ 5Ghz.... it was dirt cheap and it plays every game and is the last point of bottleneck in my PC.

Going to be a while until I see a "new aged" CPU that I care about.

This thread is depressing to read. People really get worked up about something that is so meaningless. AMD vs INTEL. Grow up people.


----------



## cucker tarlson (Feb 12, 2020)

nickbaldwin86 said:


> I run a i3-8350K @ 5Ghz.... it was dirt cheap and it *plays every game *and is the last point of bottleneck in my PC.
> 
> Going to be a while until I see a "new aged" CPU that I care about.
> 
> This thread is depressing to read. People really get worked up about something that is so meaningless. AMD vs INTEL. Grow up people.


while technically correct,it all depends on what you need it to do.


----------



## ARF (Feb 12, 2020)

nickbaldwin86 said:


> I run a i3-8350K @ 5Ghz.... it was dirt cheap and it plays every game and is the last point of bottleneck in my PC.



There are games where it will stutter. In BF1, for example, you can max out a 4c/4t CPU, which means you get stutter when you hit 100% usage.

It's in your interest to get an AMD Ryzen: smoother gameplay because of the extra cores/threads, for less money, and in a lower power envelope.
No need for industrial chillers; it just runs cool & quiet.


----------



## bug (Feb 12, 2020)

nickbaldwin86 said:


> I run a i3-8350K @ 5Ghz.... it was dirt cheap and it plays every game and is the last point of bottleneck in my PC.
> 
> Going to be a while until I see a "new aged" CPU that I care about.
> 
> This thread is depressing to read. People really get worked up about something that is so meaningless. AMD vs INTEL. Grow up people.


Yeah, well, it gives you the impression you contributed something when you have nothing to do 


cucker tarlson said:


> while technically correct,it all depends on what you need it to do.


I think he was pretty clear: he games. And Intel still holds the crown for office applications, because those don't multithread well. AMD rules synthetics and very few things you routinely run at home.


----------



## AnarchoPrimitiv (Feb 12, 2020)

HwGeek said:


> Let me guess... this will be fastest Gaming CPU at 1080P with RTX 3080Ti .
> Poor MB vendors, how they gonna sell those expensive MB's with expensive VRM for DIY market while AMD dominated this market?
> Aren't they already stuck with huge X299 stock?, and now this?



I'm glad someone pointed this out!  The fact is that Intel is NOT the fastest at gaming IN GENERAL, they're ONLY the fastest at gaming in the specific instance of low resolution (1080p or lower), high refresh, and with a 2080 ti (or whatever the top consumer card may be at the moment).


----------



## bug (Feb 12, 2020)

AnarchoPrimitiv said:


> I'm glad someone pointed this out!  The fact is that Intel is NOT the fastest at gaming IN GENERAL, they're ONLY the fastest at gaming in the specific instance of low resolution (1080p or lower), high refresh, and with a 2080 ti (or whatever the top consumer card may be at the moment).


Intel is faster in general. But in many instances they win by less than 10%, which is something you won't notice with the naked eye, so those are essentially ties.


----------



## cucker tarlson (Feb 12, 2020)

AnarchoPrimitiv said:


> I'm glad someone pointed this out!  The fact is that Intel is NOT the fastest at gaming IN GENERAL, they're ONLY the fastest at gaming in the specific instance of low resolution (1080p or lower), high refresh, and with a 2080 ti (or whatever the top consumer card may be at the moment).


how can a cpu be faster for 1080p,high refresh rate and with a high end gpu only but not the fastest in general ?


----------



## catulitechup (Feb 12, 2020)

nickbaldwin86 said:


> I run a i3-8350K @ 5Ghz.... it was dirt cheap and it plays every game and is the last point of bottleneck in my PC.
> 
> Going to be a while until I see a "new aged" CPU that I care about.
> 
> This thread is depressing to read. People really get worked up about something that is so meaningless. AMD vs INTEL. Grow up people.



In my case I have also had an i3 8350K @ 5.0 GHz (1 core disabled, no AVX) since early 2018; when I bought it, AMD didn't have anything with the same per-core performance.

I mostly use lightly threaded apps, many of them legacy apps (3 cores or fewer in most cases), and a screen resolution close to 720p (the most critical scenario for Ryzen).

However, Zen 3 now seems very interesting; I may think about changing, depending on the per-core performance of the quad-core (no HT) or six-core (no HT) models.

For the moment I won't buy Intel; maybe for specific cases (mostly legacy apps or Intel-certified apps), but not for gaming right now.


----------



## ppn (Feb 12, 2020)

cucker tarlson said:


> I'm not touching anything that uses over 1.4v,let alone +1.55v like 3000
> If 10th gen does super high voltage for boost clocks I'm going 9700K and I don't mind the thread count


10th gen likely requires 100 mV less voltage at equal clocks, so there is no reason to go 9700K unless it is 20% cheaper; and then the Z370 motherboard would be a total waste, with no future-proofing. Just manually set 1.25 V and call it a day.


----------



## cucker tarlson (Feb 12, 2020)

ppn said:


> 10th gen likely requires 100 mV less voltage at equal clocks, so there is no reason to go 9700K unless it is 20% cheaper; and then the Z370 motherboard would be a total waste, with no future-proofing.


Future-proofing your platform has more implications than you think:
it involves buying a recent board at launch price and a recent CPU at launch price, and then selling the CPU later to get a new one, again at launch price.

Buying Z370 and a 9700K once 10th gen arrives?
Boards are dirt cheap, and the CPUs will likely drop in price substantially.


----------



## JustNiz (Feb 12, 2020)

Where is this 5.3 GHz turbo info coming from? Everywhere else is saying the 10700K will do 5.1 GHz.
I can't imagine 5.3 being true, or if it is, there has to be some giant catch compared to the 10900K's boost.
If the 10700K boosts to 5.3 and the 10900K only boosts to 5.1, then the 10700K will give better performance for nearly everything (including gaming) than the 10900K, as hardly anything actually uses, let alone maxes out, 10 cores (or 20 threads with Hyper-Threading).
I just can't imagine Intel sitting still for a 10700K having better real-world performance than their flagship 10900K.


----------



## cucker tarlson (Feb 12, 2020)

JustNiz said:


> Where is this 5.3 GHz turbo info coming from? Everywhere else is saying the 10700K will do 5.1 GHz.
> I can't imagine 5.3 being true, or if it is, there has to be some giant catch compared to the 10900K's boost.
> If the 10700K boosts to 5.3 and the 10900K only boosts to 5.1, then the 10700K will give better performance for nearly everything (including gaming) than the 10900K, as hardly anything actually uses, let alone maxes out, 10 cores (or 20 threads with Hyper-Threading).
> I just can't imagine Intel sitting still for a 10700K having better real-world performance than their flagship 10900K.


I don't think IVB will keep the 10900K from clocking as high as the 10700K wherever the 10700K can hit 5.3 GHz.


----------



## dicktracy (Feb 12, 2020)

A 2015 CPU arch is still the fastest in gaming. Let that sink in.


----------



## cucker tarlson (Feb 12, 2020)

dicktracy said:


> A 2015 CPU arch is still the fastest in gaming. Let that sink in.


clocks and latency.
ryzen is taking mainstream into a totally different direction so no surprises.


----------



## HD64G (Feb 12, 2020)

Same insecure arch with ultra-high power consumption, expensive chipset and motherboard. Not a product for 2020 imho even with a discount vs the 9-series.


----------



## Turmania (Feb 12, 2020)

Super XP said:


> This reminds me of the Pentium 4 days, when Intel kept pushing higher clock speeds while AMD innovated on CPU design; AMD's CPUs would beat P4s despite clocking up to 1000 MHz lower. That's how efficient and well designed the Athlon 64 and its successors were.


Too bad it was very short-lived; let's hope that after almost 20 years AMD can actually sustain it with Ryzen.


----------



## Super XP (Feb 12, 2020)

yeeeeman said:


> AMD made the same mistake a couple of years later with Bulldozer. My, how people forget.
> Intel is not pushing frequency now by choice. They just can't fab any new design big or cost-effective enough yet, so they squeeze what they can from 14nm.


Absolutely NOT. Do not compare Bulldozer to the Pentium 4; AMD's desire was not to mindlessly boost clock speeds, and that was not the reason AMD designed Bulldozer.

AMD chose a different route with Bulldozer, as a way to rapidly increase core counts: instead of supporting SMT like Intel, AMD went the physical route and found a way to add more CPU cores. 

Obviously this didn't work, which is why AMD hired Jim Keller, a genius CPU architect.


----------



## ARF (Feb 12, 2020)

Turmania said:


> Too bad it was very short-lived; let's hope that after almost 20 years AMD can actually sustain it with Ryzen.



We have chiplet technology, which enables AMD to stay competitive regardless of any potential big IPC gains made against Ryzen.

You can have the 64-core Ryzen Threadripper 3990X, and it only needs future shrinks, which will be sufficient.


----------



## Super XP (Feb 12, 2020)

ARF said:


> We have the chiplets technology which enables AMD to be competitive regardless of any potential big IPC changes against Ryzen.
> 
> You can have the 64-core Ryzen Threadripper 3990X and it only needs future shrinks which will be sufficient enough.


AMD won't make that error again. Future ZENs will never be a simple die shrink. With each generation, AMD is essentially re-designing the processor for significant IPC gains.

A great example is what we are seeing today: ZEN to ZEN2 to the upcoming ZEN3, etc.
Note: ZEN+ was supposed to have been the original ZEN. That's why AMD officially announced no more plain old refreshes, such as a ZEN2+ or ZEN3+...



kapone32 said:


> Well they will still have their years of propaganda and questionable marketing practices to rely on for now. Even north of $500 there will be plenty of people that buy this so I don't see this being cheaper than the 9900K. Where the potential danger for Intel is in the laptop space as those new AMD mobile chips seem to be very disruptive.
> 
> AIOs (I know there will be comments to the viability) are a no brainer for CPUs and for me the bigger they are the better they are at dissipating heat. I personally have a couple of 420MM rads to cool my setup. The only thing I was establishing is you will need something powerful to cool this and get the advertised boost clocks.


The only people who will buy Intel processors are non-tech-savvy people and those who don't research tech on a daily basis. And to this date, many retailers are still pushing Intel for general-purpose computing and gaming despite Ryzen being far superior. But not everywhere: Ryzen does very well online, where people read real reviews before deciding.


----------



## nickbaldwin86 (Feb 12, 2020)

ARF said:


> There are games where it will stutter, in BF1, for example, you can max out a 4c/4t cpu, which means you get stutter when you hit 100% usage.
> 
> It's for your interest to take AMD Ryzen - smoother gameplay because of the more cores/threads, for less money and in lower power envelope.
> No need for industrial chillers, just works cool&quiet.



I play all the newest titles and older ones... I have yet to see my CPU become a noticeable bottleneck. A benchmark isn't enough to make me spend money on an upgrade. I rarely drop under 100 FPS on my 2560x1080 monitor, and in most games I can tweak settings to hit 166 FPS for the monitor's 166 Hz.

It does great for me. I'm not saying my 8350K wouldn't be a bottleneck in a system with two 2080 Tis... because in that case, yes, it would bottleneck that system.

I was more pointing out that people get so mad about something so small and something they can't even control. You can control it with your buying habits; yelling at each other in a forum post isn't doing anything.


----------



## ARF (Feb 12, 2020)

Super XP said:


> AMD won't make that error again. Future ZENs will never be a simple die shrink. With each generation, AMD is essentially re-designing the processor for significant IPC gains.
> 
> A great example is what we are seeing today: ZEN to ZEN2 to the upcoming ZEN3, etc.
> Note: ZEN+ was supposed to have been the original ZEN. That's why AMD officially announced no more plain old refreshes, such as a ZEN2+ or ZEN3+...



Well, the situation isn't that bad. If you compare multi-threaded apps, you'll see that the FX-8350 is actually a bit ahead of the Core i7-2600K, despite its 25-30% IPC disadvantage.
Single-core performance is no longer important, and in the future it will become even less so.


----------



## efikkan (Feb 12, 2020)

HD64G said:


> Same insecure arch with ultra-high power consumption, expensive chipset and motherboard. Not a product for 2020 imho even with a discount vs the 9-series.


Statistically speaking, a long-lived architecture with refinements is much more likely to have had its larger bugs and problems identified and eliminated.

Skylake may be on "life support" not because it's bad (it's very impressive for an architecture from 2015), but because we need higher IPC to scale further, and these current clock speeds are really not sustainable.


----------



## Super XP (Feb 12, 2020)

ARF said:


> Well, the situation isn't that bad. If you compare multi-threaded apps, you would see that the FX-8350 is actually a bit ahead of Core i7-2600K, despite the 25-30% IPC disadvantage.
> Single core performance is no longer important, and in the future it will become ever less important.


In PC gaming, single-core performance is unfortunately still important.
Hopefully developers will take advantage of at least 16-thread CPUs as a default standard. This should be pushed industry-wide ASAP.

Intel holds a very slight lead in gaming performance over Ryzen. ZEN3 should eliminate that lead, based on the information available about it.


----------



## efikkan (Feb 12, 2020)

Super XP said:


> In PC Gaming single core performance is important unfortunately.
> Hopefully developers further take advantage of at least 16 threaded CPUs as a default standard. This should be pushed industry wide ASAP.
> 
> Intel holds a very slight lead in gaming performance over Ryzen. ZEN3 will eliminate that lead based on what information is available about it.


Games will not be able to scale well with many threads. The workloads of a game are basically divided into three categories: 1) rendering, 2) core game simulation, 3) optional stuff (like networking, sound effects, etc.). Rendering (1) only scales until the GPU is no longer bottlenecked. While some games can leverage multiple queues to do multi-pass rendering etc., the scaling potential here is fairly limited anyway, and the thread(s) feeding the GPU are usually the only ones that matter for FPS. Also, the long-term trend in games is that the GPU does more and more of the heavy lifting. Game simulation (2) (the game loop) is usually a fixed workload, since the game needs to work the same across machines, and is usually scaled towards a game's minimum requirements. Games are also incredibly latency-sensitive, which makes it very hard to divide timing-critical tasks into tiny chunks across many cores. We might see some more multi-core scaling in gaming, but games will not use 16 threads even 10 years from now (except for a couple of edge cases, of course).

Applications in general are slowly scaling better across multiple threads, but even here there are theoretical limits to what is achievable. Even with the best efforts, most synchronized workloads (which most client applications are) will see a scaling drop-off around 8 threads, and will not scale well beyond 16 threads. Asynchronous workloads scale nearly perfectly, but very few of those are relevant for end users. Anything beyond 8 cores is more relevant for users running multiple applications simultaneously than for single applications.
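
That drop-off can be sketched with Amdahl's law; the 90% parallel fraction used below is an illustrative assumption, not a measured figure:

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the fraction
# of the work that parallelizes. p = 0.9 is an illustrative assumption.

def speedup(n_threads: int, p: float = 0.9) -> float:
    """Best-case speedup on n_threads for a workload that is p parallel."""
    return 1.0 / ((1.0 - p) + p / n_threads)

for n in (1, 2, 4, 8, 16, 32):
    print(f"{n:>2} threads: {speedup(n):.2f}x")
```

Even at a generous 90% parallel fraction the curve flattens hard past 8 threads (about 4.7x at 8, 6.4x at 16) and can never exceed 10x.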


----------



## RealNeil (Feb 12, 2020)

R0H1T said:


> I doubt that, I'm guessing Intel will go with a $600~700 flagship this round.


If they do more of the same with their prices, I and many others will do without it.


----------



## bug (Feb 12, 2020)

cucker tarlson said:


> how can a cpu be faster for 1080p,high refresh rate and with a high end gpu only but not the fastest in general ?


Easy: once you crank up the resolution, the CPU is no longer the bottleneck, so it doesn't matter whether it's the fastest or not. Though if I had to choose, I'd still go for the CPU that can push geometry to the GPU faster. Because of those frame dips, you know.


----------



## cucker tarlson (Feb 12, 2020)

bug said:


> Easy: once you crank up resolution, the CPU is no longer the bottleneck, so it doesn't matter if it's the faster or not. Though if I had to choose, I'd still go for the CPU that can push geometry to the GPU faster. Because of those frame dips, you know.


is "it doesn't matter" the same as "it isn't better"?

when I'm loading games on my 128 GB SATA SSD I get roughly the same times as a 970 Pro owner. So the 970 Pro isn't better in general.


----------



## bug (Feb 12, 2020)

cucker tarlson said:


> it doesn't matter is same as it isn't better ?
> 
> when I'm loading games on my 128gb sata ssd I get roughly the same times as a 970 Pro owner.so 970 pro isn't better in general.


Well, that's a tough one.
The 970 will win more synthetics, but if nothing you do with it is faster irl, is it really faster _in general_? Is a Ferrari faster than a Corolla _in general_, if all you do with it is drive through town?


----------



## cucker tarlson (Feb 12, 2020)

bug said:


> Well, that's a tough one.


no.


bug said:


> Is a Ferrari faster than a Corolla _in general_, if all you do with it is drive through town?


yes.


----------



## bug (Feb 12, 2020)

cucker tarlson said:


> yes.


And yet, it's not getting you faster from point A to point B.


----------



## cucker tarlson (Feb 12, 2020)

bug said:


> And yet, it's not getting you faster from point A to point B.


cause you're driving slow.

what the hell am I even explaining right now.


----------



## bug (Feb 12, 2020)

cucker tarlson said:


> cause you're driving slow.


You're getting warm: usage patterns matter.


----------



## cucker tarlson (Feb 12, 2020)

bug said:


> You're getting warm: usage patterns matter.


for a user,not for objective truth.


----------



## bug (Feb 12, 2020)

cucker tarlson said:


> for a user,not for objective truth.


To me, the user is more true than lab conditions.


----------



## Minus Infinity (Feb 12, 2020)

Object55 said:


> 5.3 - all security patches is more like 4.1



Ha ha, indeed closer to the truth. I couldn't care less if it clocked to 10 GHz; it still wouldn't interest me. A Zen 3 4700X or 4900X is all I want for my new build.


----------



## Mistral (Feb 13, 2020)

Can anyone please confirm if this comes boxed with a chiller or not?..


----------



## Dave65 (Feb 13, 2020)

This will put Intel back on top.................................................. As long as you live in an Igloo


----------



## TranceHead (Feb 13, 2020)

GlacierNine said:


> Nobody going to talk about that 0nm manufacturing process? I'll be really impressed if Intel can pull that off.


They once measured in microns (µm).
Once they hit sub-micron, they started measuring in nanometres (nm).
I suspect once they get to 1 nm or under, they'll start announcing lithography in picometres (pm): 1000 pm to 1 nm.


----------



## GoldenX (Feb 13, 2020)

Ok, so nothing relevant or new from Intel until 2021, what a joke.
AMD is going to increase prices, get ready.


----------



## dicktracy (Feb 13, 2020)

cucker tarlson said:


> clocks and latency.
> ryzen is taking mainstream into a totally different direction so no surprises.


Also why I returned my 1700 and went with a 5 GHz 8700K for chart-topping gaming performance that'll actually last me for years, instead of constantly upgrading through Zen -> Zen+ -> Zen 2 and STILL being slower than a 5-year-old Intel arch lol. The CCX rumors better be true, for AMD's sake.


----------



## ToxicTaZ (Feb 13, 2020)

laszlo said:


> View attachment 144617



That's what happens when you try running your AMD 3000 series CPUs @INTEL @5GHz+ speeds lol


----------



## Rob94hawk (Feb 13, 2020)

Base clocks should have hit 5 GHz a long time ago. That being said, let's see the benchmark comparisons.


----------



## ToxicTaZ (Feb 13, 2020)

Rob94hawk said:


> Base clock should have been 5 Ghz a long time ago. That being said lets see the benchmark comparisons.



Base clocks are still under 4 GHz; only Turbo goes to 5 GHz+.

I'm running my 9900KS at 5.2 GHz with AVX2 on all cores, so technically it's a 5 GHz base with a 5.2 GHz turbo.

I'm betting the 10700K only runs 2 cores at 5.3 GHz on Turbo. These CPUs are made for your average 200 W CPU cooler.


----------



## R0H1T (Feb 13, 2020)

GoldenX said:


> AMD is going to increase prices, get ready.


Highly unlikely. While they are selling as many as they can make right now, the step up in price from the 2990WX to the 3990X, as well as from the 2700X to the 3950X, has allowed Intel to offer better value for money across lots of price points. Which basically means that if AMD does go even higher, especially for their mainstream flagship, their sales could drop at least in proportion to the share they're currently enjoying in the DIY segment.


----------



## ARF (Feb 13, 2020)

efikkan said:


> Games will not be able to scale well with many threads. The workloads of a game are basically divided into three categories: 1) rendering, 2) core game simulation, 3) optional stuff (like networking, sound effects, etc.). Rendering (1) only scales until the GPU is no longer bottlenecked. While some games can leverage multiple queues to do multi-pass rendering etc., the scaling potential here is fairly limited anyway, and the thread(s) feeding the GPU are usually the only ones that matter for FPS. Also, the long-term trend in games is that the GPU does more and more of the heavy lifting. Game simulation (2) (the game loop) is usually a fixed workload, since the game needs to work the same across machines, and is usually scaled towards a game's minimum requirements. Games are also incredibly latency-sensitive, which makes it very hard to divide timing-critical tasks into tiny chunks across many cores. We might see some more multi-core scaling in gaming, but games will not use 16 threads even 10 years from now (except for a couple of edge cases, of course).
> 
> Applications in general are slowly scaling better across multiple threads, but even here there are theoretical limits to what is achievable. Even with the best efforts, most synchronized workloads (which most client applications are) will see a scaling drop-off around 8 threads, and will not scale well beyond 16 threads. Asynchronous workloads scale nearly perfectly, but very few of those are relevant for end users. Anything beyond 8 cores is more relevant for users running multiple applications simultaneously than for single applications.



The CPUs in the new consoles arriving later in 2020 will have 8 cores and 16 threads.
Games can use as many threads as you throw at them, because areas like physics, AI acceleration, and ray-tracing acceleration will all greatly benefit from many cores.

I can't wait to try the 64-core Threadripper 3990X or its derivatives shrunk to lower TDP envelopes.


----------



## dont whant to set it"' (Feb 13, 2020)

Not that I would care, yet I had no clue Intel was competing against nuclear fission in thermal power density. Good for them, good for Intel.


----------



## ToxicTaZ (Feb 13, 2020)

ARF said:


> The CPUs in the new consoles arriving later in 2020 will have 8 cores and 16 threads.
> Games can use as many threads as you throw at them, because there are areas like physics, AI acceleration, ray-tracing acceleration all of which will greatly benefit from many cores.
> 
> I can't wait to try the 64-core Threadripper 3990X or its derivatives shrunk to lower TDP envelopes.



Gaming does not benefit from more cores yet!!! Thus the 9900KS is the fastest gaming CPU! (And yes, the 9900KS is the gaming king!) And yes, the 9900KS will beat the 3990X in PC gaming. Plus, the 3990X is a 280 W monster if you're talking about TDP, not to mention the mortgage you need to take out to buy one lol.

We are still living in a world of 4 cores, where the money-making entry-level CPUs thrive: your i5 and R5 series, people.


----------



## kapone32 (Feb 13, 2020)

ToxicTaZ said:


> Gaming does not benefit from more cores yet!!! Thus the 9900KS is the fastest Gaming CPU! (And yes the 9900KS is the gaming king!) And yes the 9900KS will beat the 3990X in PC gaming. Plus the 3990X is 280w Watts monster if your talking about TDP and the mortgage you need to take out to buy one lol.
> 
> We are still living in a world of 4 cores still where the money making entry level CPUs thrive your i5 & R5 series people.



Why would you compare a 64-core CPU that Windows 10 sees as 2 CPUs with a CPU that is made for gaming? You should reference the 3800X instead. And is the 9900KS really that much faster, period?


----------



## ARF (Feb 13, 2020)

ToxicTaZ said:


> Gaming does not benefit from more cores yet!!! Thus the 9900KS is the fastest Gaming CPU! (And yes the 9900KS is the gaming king!) And yes the 9900KS will beat the 3990X in PC gaming. Plus the 3990X is 280w Watts monster if your talking about TDP and the mortgage you need to take out to buy one lol.
> 
> We are still living in a world of 4 cores still where the money making entry level CPUs thrive your i5 & R5 series people.



The Core i9-9900KS is fastest only if you game at 720p or 1080p with an RTX 2080 Ti.
If you game at 2160p with a Radeon RX 5700 XT, there will be no difference.

So it's better for you to buy the Ryzen 9 3900X, which has 24 threads, for everything beyond gaming alone!


And yes, you have to run the 64-core Threadripper 3990X either under Windows 10 Enterprise or Linux.


----------



## bug (Feb 13, 2020)

ARF said:


> Core i9-9900KS is fastest only if you game at 720p and 1080p with a RTX 2080 Ti.
> If you game at 2160p with Radeon RX 5700 XT, there will be no difference.


It's still ~10% faster than a mid-range Ryzen even at QHD: https://www.techpowerup.com/review/intel-core-i9-9900ks/15.html
Granted, even a flat 10% isn't much of a difference. I mean, if Ryzen is unplayable at 30fps in a game you like, 9900KS' 33fps won't help you much.


----------



## GlacierNine (Feb 13, 2020)

ToxicTaZ said:


> Gaming does not benefit from more cores yet!!! Thus the 9900KS is the fastest Gaming CPU! (And yes the 9900KS is the gaming king!) And yes the 9900KS will beat the 3990X in PC gaming. Plus the 3990X is 280w Watts monster if your talking about TDP and the mortgage you need to take out to buy one lol.
> 
> We are still living in a world of 4 cores still where the money making entry level CPUs thrive your i5 & R5 series people.


Why are you comparing a 64-core workstation CPU to an 8-core mainstream desktop CPU? Per-core power draw on the 3990X is a third of what the 9900KS pulls:


----------



## DeeJay1001 (Feb 13, 2020)

ARF said:


> I think it will be an epic fail and the only way to escape from that punishment is not to buy it. Ever.
> 
> Especially when you can buy the 65-watt Ryzen 7 3700X for $310.
> 
> ...



Who is paying $300+ for a 3700X? Microcenter had them last week for $259.


----------



## ARF (Feb 13, 2020)

GlacierNine said:


> Why are you comparing a 64 core workstation CPU to an 8 core Mainstream desktop CPU? Per-Core power draw on the 3990X is 1/3rd what the 9900KS pulls:
> 
> View attachment 144743



In part, the reason for this is that the best chiplets, with the lowest voltages, go into the 3990X.

Imagine if the best chiplets were used for the lower-end models. A 45-55 W 3700X could be possible.


----------



## bug (Feb 13, 2020)

DeeJay1001 said:


> Who is pay $300+ for a 3700x? Microcenter had them last week for $259


Probably the tiny fraction of the world that doesn't live next to a Microcenter?


----------



## GlacierNine (Feb 13, 2020)

ARF said:


> In part the reason for this is that the best chiplets with lowest voltages go for the 3990X.
> 
> Imagine if the best chiplets were used for the lower end models. Can be possible a 45-55-watt 3700X.


Sure, but I'm mostly just trying to get TaZ to engage his brain for once instead of being... well, the way he is.

He's basically trying to compare a Mazda 787B to something designed for the Paris-Dakar. Sure, the Mazda will dominate around a racetrack like, say, Suzuka, but the Dakar buggy will race in conditions where the 787B won't even make it off the starting line. Pretending they're built for the same job is beyond stupid; it displays a total inability to actually think.


----------



## londiste (Feb 13, 2020)

GlacierNine said:


> Why are you comparing a 64 core workstation CPU to an 8 core Mainstream desktop CPU? Per-Core power draw on the 3990X is 1/3rd what the 9900KS pulls:


There is quite a lot wrong with just putting out that graph.
1. It is based on whole-system consumption. 112 W across 64 cores looks much better than 59 W across 8, especially if almost all of that draw has nothing to do with the CPU.
2. 3.3 GHz vs 4.4 GHz. Zen 2 does 3.4-3.5 GHz at 2-3 W per core, which is an awesome result. At the same time, it does 4.2 GHz at about 10 W; any higher than that and it goes all out of whack. Mine does 4.3 GHz at about 15 W.
Those are core-only figures.
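
Point 1 can be shown with quick arithmetic; the 40 W platform baseline below is a hypothetical figure chosen only to illustrate how whole-system numbers skew per-core comparisons:

```python
# The graph divides whole-system draw by core count. Subtracting a platform
# baseline (everything that is not the CPU cores) changes the picture a lot.
# The 40 W baseline is a hypothetical figure for illustration only.

BASELINE_W = 40.0  # assumed non-CPU system draw (motherboard, RAM, fans, ...)

def naive_per_core(system_w: float, cores: int) -> float:
    """Per-core figure as the graph computes it: system draw / cores."""
    return system_w / cores

def adjusted_per_core(system_w: float, cores: int,
                      baseline_w: float = BASELINE_W) -> float:
    """Per-core figure after removing the (assumed) platform baseline."""
    return (system_w - baseline_w) / cores

# Numbers quoted in the thread: 112 W across 64 cores vs 59 W across 8.
for name, watts, cores in (("3990X", 112.0, 64), ("9900KS", 59.0, 8)):
    print(f"{name}: naive {naive_per_core(watts, cores):.2f} W/core, "
          f"adjusted {adjusted_per_core(watts, cores):.2f} W/core")
```

With these assumed numbers, the naive per-core gap of roughly 4x shrinks to roughly 2x once the shared platform draw is taken out.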


----------



## GlacierNine (Feb 13, 2020)

londiste said:


> There is quite a lot wrong with just putting out that graph.
> 1. This is based off the whole system consumption. 112W/64 looks much better than 59W/8 especially is almost all of it has nothing to do with CPU.
> 2. 3.3 GHz vs 4.4 GHz. Zen2 does 3.4-3.5GHz at 2-3W which is an awesome result. At the same time it does 4.2GHz at about 10W. Any higher than that it goes all out of whack. Mine does 4.3 GHz at about 15W.
> This is core only consumption.


Sure, but is there any more wrong with that graph than there is with trying to claim that a 3990X and a 9900KS are competing products? Because that graph is more than good enough to prove the point I was making.


----------



## efikkan (Feb 13, 2020)

Mistral said:


> Can anyone please confirm if this comes boxed with a chiller or not?..


K-models don't, thankfully. 



Rob94hawk said:


> Base clock should have been 5 Ghz a long time ago. That being said lets see the benchmark comparisons.


And NetBurst (Pentium 4) was supposed to reach 7-8 GHz…
The long-term trajectory for upcoming nodes is decreased clock speed, and all of these CPUs are already pushing clocks into "throttle territory", so pushing clocks significantly higher is not feasible until different types of semiconductors arrive on the market.

The way forward is higher IPC and more SIMD.



ARF said:


> The CPUs in the new consoles arriving later in 2020 will have 8 cores and 16 threads.
> Games can use as many threads as you throw at them, because there are areas like physics, AI acceleration, ray-tracing acceleration all of which will greatly benefit from many cores.


I think you missed the point. It's not about just spawning threads, but about actually using them for performance gains. Interestingly, all the things you mentioned could or should be accelerated on a GPU, not on a bunch of CPU cores.

The key to multithreaded scaling is dividing a workload into independent chunks and letting the cores chew at them. This is very difficult for games, as they are a pipeline of serial tasks, where only some smaller bits can be parallelized, yet all of them need to be synced up many times throughout the pipeline. For a game with a tick rate of e.g. 120 Hz, there is an 8.33 ms window for the entire game simulation, usually starting with input event processing, followed by game logic like collisions etc. If the game uses particle simulations for effects, these have to come after the core game simulation but before the rendering. The overhead of synchronizing threads grows with thread count, so there is a point where you run into diminishing returns, not to mention stutter.

And when it comes to rendering, there is really no point in throwing more threads at it, as the CPU only needs to keep the GPU busy. While it is possible to have multiple CPU threads build a single queue, there is no point, as it will only create more latency on the driver side, not to mention synchronization problems. The only real purpose of multiple threads for rendering is doing different tasks, like having one thread load resources while another renders. There will be very few real-world cases where more than 2-3 threads are useful for feeding the GPU in gaming.
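
That serial pipeline can be sketched as a fixed-timestep loop; the phase names follow the description above, but the phase bodies are empty placeholders:

```python
import time

TICK_HZ = 120
BUDGET_S = 1.0 / TICK_HZ  # the ~8.33 ms window for one tick

# Serial phases of a single tick. Each phase consumes the previous phase's
# results, so they cannot run concurrently; the bodies are placeholders.
def process_input():
    pass

def simulate_game():     # collisions, game logic
    pass

def simulate_effects():  # particles etc., after the core simulation
    pass

def submit_rendering():  # feed the GPU its command queue
    pass

def run_tick() -> float:
    """Run one tick and return how long it took."""
    start = time.perf_counter()
    # Strict ordering: adding cores cannot shorten this chain.
    for phase in (process_input, simulate_game, simulate_effects,
                  submit_rendering):
        phase()
    elapsed = time.perf_counter() - start
    if elapsed > BUDGET_S:
        print(f"tick overran its {BUDGET_S * 1000:.2f} ms budget")
    return elapsed
```

The point of the sketch is the ordering constraint: no matter how many cores are available, the chain of dependent phases must fit inside the tick budget.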


----------



## ARF (Feb 13, 2020)

efikkan said:


> I think you missed the point. It's not about just spawning threads, but to actually use them for performance gains. Interestingly, all the things you mentioned could or should be accelerated on a GPU, not on a bunch of CPU cores.
> 
> The key to multithreaded scaling is dividing a workload into independent work chunks and let the cores chew at them, this is very difficult for games, as they are a pipeline of serial tasks, where only some smaller bits can be parallelized, but all of them needs to be synced up many times throughout the pipeline. For a game with a tick rate of e.g. 120 Hz, there is an 8.33ms window for the entire game simulation, usually starting with input event processing followed by game logic like collisions etc. If the game uses particle simulations etc. for effects, this has to come after the core game simulation, but before the rendering. The overhead of synchronizing threads grows with thread count, so there will be a point where you get into diminishing returns and not to mention stutter.
> 
> And when it comes to the rendering, there is really no point in throwing more threads at it, as the CPU only needs to keep the GPU busy. While it is possible to have multiple CPU threads build a single queue, there is no point to it as it will only create more latency on the driver side, not to mention synchronization problems. The only real purpose with multiple threads for rendering is to do different tasks, like having one thread to load resources while another is rendering. There will be very few real-world cases where more than 2-3 threads would be useful for the GPU in gaming.



You must use the CPU cores, which mostly sit idle during gaming, in some kind of hybrid CPU-GPU acceleration.
What you want is for the GPU to take the whole load while the CPU is completely offloaded.

You have the 64-core 3990X; better start coding for it now.


----------



## ToxicTaZ (Feb 13, 2020)

At the end of the day... the 10700K should sit somewhere between the 3800X and the 9900KS performance-wise.

Intel will keep the fastest-gaming-CPU title until the 10900KS, or until AMD comes out with a 5 GHz Ryzen 4000... it's GHz that wins in gaming, not cores, for now.

Your market core counts are set by your entry-level CPUs, and both AMD and Intel are still making 4- and 6-core CPUs. Telling everyone they need a more-than-8-core CPU is ludicrous at the moment... until your bottom-end i5 and R5 get 8 cores standard.

This thread is about an Intel dual-channel platform, so we should keep to dual-channel performance topics.


----------



## GlacierNine (Feb 13, 2020)

ToxicTaZ said:


> At the end of the day.... The 10700K should sit somewhere in between 3800X and 9900KS performance wise.
> 
> Intel will keep the fastest Gaming CPU title until the 10900KS or AMD comes out with a 5GHz Ryzen 4000.....it's GHz that wins in gaming not cores for now.
> 
> ...


There is absolutely no way in hell you are not an intentional troll.


----------



## kapone32 (Feb 13, 2020)

ToxicTaZ said:


> At the end of the day.... The 10700K should sit somewhere in between 3800X and 9900KS performance wise.
> 
> Intel will keep the fastest Gaming CPU title until the 10900KS or AMD comes out with a 5GHz Ryzen 4000.....it's GHz that wins in gaming not cores for now.
> 
> ...



Hmm, I thought AMD got rid of 4 cores (APUs excluded); by 2nd gen there were no quad-cores, the lowest was the 2600, and they are all hyper-threaded. Not all games are bound by clock speed, and some games do gain from having more cores (AOTS, Strange Brigade). The current thinking on gaming is that 6 cores are the sweet spot. As a result, no YouTubers advise their viewers to use the i5 series.


----------



## efikkan (Feb 13, 2020)

kapone32 said:


> Not all games are based on clock speed and some games do gain from having multi cores (AOTS, Strange Brigade). The current thought process on gaming is 6 cores are the sweet spot.


There is no such thing as optimizing for clock speed. Any scalable code will scale with core performance, no matter how that performance is achieved. If clock speed were the thing that mattered, Bulldozer would be the king of gaming CPUs!

And as many people keep forgetting, per-core performance and multi-core performance are not opposites; in fact, faster cores are important for scaling further with multithreading, as there are always diminishing returns.


----------



## Vayra86 (Feb 13, 2020)

Yawn, Intel.


----------



## Kickus_assius (Feb 13, 2020)

Super XP said:


> This reminds me of the Pentium 4 days where Intel kept pushing higher clock speeds while AMD was innovating on CPU designs where AMD CPUs would beat P4's with up to 1000MHz lower clocks. That's how efficient and well designed the Athlon 64 was and beyond.



They would have been even more effective vs. Intel if they could have kept the temperatures down back then.  Once they finally got the die shrink rolled out with Athlon 64, Intel struck back with Conroe (Core 2 duo) and took the crown back.


----------



## JackCarver (Feb 13, 2020)

Depends on the use case, as always. For my gaming rig I wouldn't buy a Ryzen CPU, as Intel is still on top in this use case, and I doubt that will change next time around... 5.3 GHz boost clock, and you can probably get it as an all-core boost, as most Intel CPUs are capable of that. At under $400 it's in the 3800X's price range and easily outperforms the 3900X/3950X in gaming.

The new gaming consoles all have 8-core CPUs on board, so next-gen games will all be optimized for 8 cores; more isn't really necessary here. This CPU could really be the best choice for gaming.


----------



## ppn (Feb 13, 2020)

Kickus_assius said:


> They would have been even more effective vs. Intel if they could have kept the temperatures down back then.  Once they finally got the die shrink rolled out with Athlon 64, Intel struck back with Conroe (Core 2 duo) and took the crown back.



This is happening again. Intel strikes back with Alder Lake-S, and AMD is back at 10% market share in no time.



JackCarver said:


> Depends on the use case as always. For my gaming rig I wouldn't buy a Ryzen CPU as Intel is still on the top in this use case and I doubt this will be changing the next time...5.3 GHz boost clock and you probably get it on all core boost speed as most Intel cpus are capable doing so. For under 400 it's in 3800X price range and easily outperforms 3900X/3950X in gaming.



All-core at 5.3 sounds great, but at 1.4 volts and 50% more power compared to 4.6.
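
For what it's worth, that 50% figure is close to what the first-order dynamic-power relation P ∝ f·V² predicts. A minimal sketch, assuming a 1.25 V baseline at 4.6 GHz (only the 1.4 V / 5.3 GHz point is given above):

```python
# Dynamic CPU power scales roughly as P ∝ f * V^2.
# The 1.25 V / 4.6 GHz baseline is an assumption; 1.40 V / 5.3 GHz is
# the operating point under discussion.

def power_ratio(f1: float, v1: float, f2: float, v2: float) -> float:
    """Relative dynamic power moving from (f1, v1) to (f2, v2)."""
    return (f2 / f1) * (v2 / v1) ** 2

ratio = power_ratio(4.6, 1.25, 5.3, 1.40)
print(f"~{(ratio - 1) * 100:.0f}% more power")  # lands around 45% extra
```

With those numbers the model gives roughly 45% extra power, in the same ballpark as the 50% claimed.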


----------



## JackCarver (Feb 13, 2020)

ppn said:


> Allcore on 5.3, sounds great but at 1.4 volts and 50% more power compared to 4.6.



Not necessarily. My 8700K, which has a 4.7 GHz single-core boost, easily does 4.7 GHz all-core below 1.3 V. I have it stable at 5 GHz all-core with 1.35 V, with a -2 AVX offset. I would say you could get this chip to an all-core boost with 1.3 to 1.35 V; more wouldn't be necessary.
A good cooling solution assumed.


----------



## ToxicTaZ (Feb 13, 2020)

With Intel's new Turbo 3.0 approach, is it all cores, or just 1 or 2 cores?

My 9900KS boosts on all cores, whereas the 9900K does 2 cores... the remaining 6 run at 4.7 GHz.

The 10700K would be interesting if it were 5.3 GHz on all cores... but I doubt it.


----------



## JackCarver (Feb 13, 2020)

ToxicTaZ said:


> With Intel new Turbo 3.0 approach is it all cores or is it 1 or 2 cores...
> 
> My 9900KS is all cores where the 9900K is 2 cores... Rest of the 6 are 4.7GHz
> 
> 10700K would be interesting if it was all cores 5.3GHz.... But I doubt it



But if it is 1 or 2 cores, you will be able to set an all-core boost in the BIOS. I think many 9900K users run theirs at a 5 GHz all-core boost.


----------



## ToxicTaZ (Feb 13, 2020)

ppn said:


> This is happening again. Intel strikes back with Alderlake-S, and AMD is back at 10% market in no time.
> 
> 
> 
> Allcore on 5.3, sounds great but at 1.4 volts and 50% more power compared to 4.6.



Don't forget about Intel Meteor Lake, their first 7 nm part, on the LGA 1700 socket with PCIe 5.0... most likely up against AMD Zen 5 on AM5, also PCIe 5.0. AM5 will end the AM4 board swap-out upgrade path for the first time in years.

Hopefully Intel Meteor Lake (7nm) is the big comeback.


----------



## X828 (Feb 13, 2020)

0nm  ...........     LOL. They can't even get off 14 nm.


----------



## JackCarver (Feb 13, 2020)

X828 said:


> 0nm ........... LOL. they cant even get off 14nm.



Sure Ice Lake is in 10nm...


----------



## ToxicTaZ (Feb 13, 2020)

JackCarver said:


> Sure Ice Lake is in 10nm...



10nm failed I get it! 10nm is dead. 

Ice Lake-S on 10nm+ had bad yields, so they had to fall back to 14nm++ Comet Lake with an unfinished memory controller, and its Q1 2020 arrival is 6 months late to the market. The finished, fixed memory controller comes in Rocket Lake, made by Samsung and scheduled for Q4 2020. Intel's 10th generation will be very short-lived.

Yes, they are going straight from 14 nm to 7 nm with the brand-new Fab 42 factory. Fab 42 is 7 nm from the start, then gets upgraded to 5 nm/3 nm/1.4 nm.

Intel Meteor Lake-S is their first product from Fab 42.

The new Fab 42 factory, built with Trump's approval, cost $7 billion; more than AMD is worth.

Intel's 10th and 11th gens are just gap fillers... I suggest not buying into the LGA 1200 socket. Wait for Meteor Lake on LGA 1700 with PCIe 5.0.

So, fanboys, enjoy your window of less than 3 years before your extinction from the coming Intel Meteor, lol


----------



## efikkan (Feb 13, 2020)

ToxicTaZ said:


> 10nm failed I get it! 10nm is dead.
> 
> Ice Lake-S on 10nm+ had bad yields, so they had to fall back to 14nm++ Comet Lake with an unfinished memory controller, and its Q1 2020 arrival is 6 months late to the market. The finished, fixed memory controller comes in Rocket Lake, made by Samsung and scheduled for Q4 2020. Intel's 10th generation will be very short-lived.


Where did you get the information that Comet Lake is Sunny Cove?

Intel's 10 nm might be very troubled and may have struggled with poor yields (for now), but it's not dead; far from it. Ice Lake-Y and -U alone are probably close to or surpassing Zen 2 in shipped units (though Ice Lake unfortunately covers only low-performance parts). Many people fail to realize what incredibly high volumes Intel ships in laptop and OEM parts.

But in terms of money spent, 10nm is certainly costly for Intel.


----------



## bug (Feb 13, 2020)

Kickus_assius said:


> They would have been even more effective vs. Intel if they could have kept the temperatures down back then.  Once they finally got the die shrink rolled out with Athlon 64, Intel struck back with Conroe (Core 2 duo) and took the crown back.


Well, doing a lot of work per clock tick will always be at odds with low temps.


----------



## Prima.Vera (Feb 14, 2020)

5.3 GHz on how many cores again? 1 or 2?? And at what wattage??


----------



## Melvis (Feb 14, 2020)

Oh Intel, you crack me up! When I saw the heading I thought this was your 10C/20T CPU, but nope, it's just another 9900K that should be called the 9900KSM Edition.


----------



## TranceHead (Feb 14, 2020)

ARF said:


> The CPUs in the new consoles arriving later in 2020 will have 8 cores and 16 threads.
> Games can use as many threads as you throw at them, because there are areas like physics, AI acceleration, ray-tracing acceleration all of which will greatly benefit from many cores.
> 
> I can't wait to try the 64-core Threadripper 3990X or its derivatives shrunk to lower TDP envelopes.


Not all 8 cores are going to be for gaming.
One will be locked to hypervisor duties, another one or two will be locked to the OS for seamless transitions between game and GUI, and maybe one is kept as a keyholder/decryptor.
Consoles run completely differently to PCs; the two are not comparable.


----------



## ARF (Feb 14, 2020)

TranceHead said:


> Not all 8 cores are going to be for gaming.
> One will be locked to hypervisor duties, another one or two will be locked to the OS for seamless transitions between game and GUI, and maybe one is kept as a keyholder/decryptor.
> Consoles run completely differently to PCs; the two are not comparable.



That's not true. Consoles and PCs run on the same basic platform: the x86-64 CPU architecture and APIs also found in the Windows environment.

With the next-gen consoles, we are going to see an estimated 6x-8x CPU speed increase and between 95-110% higher GPU performance.

The Jaguar-based 4-module CPUs in the PS4 and XBO will be completely obliterated.
It will be a new experience in the console world, never seen before.

Which of course means that the requirements for PC games will increase, since those have always been more demanding and perhaps less optimised.

Read your homework - https://www.eurogamer.net/articles/...ilt-a-zen-2-navi-pc-to-next-gen-console-specs


----------



## TranceHead (Feb 14, 2020)

ARF said:


> That's not true. Consoles and PCs run on the same basic platform: the x86-64 CPU architecture and APIs also found in the Windows environment.
> 
> With the next-gen consoles, we are going to see an estimated 6x-8x CPU speed increase and between 95-110% higher GPU performance.
> 
> ...


No shit; consoles split resources differently.
Not all 8 CPU cores will be used for gaming.
The PS4 had 8 Jaguar cores, and not all of them were used for gaming; its resources were split.


----------



## Vayra86 (Feb 14, 2020)

JackCarver said:


> Not necessarily as my 8700K, which has 4.7 single core boost, gets easily on 4.7 GHz all core with below 1.3V. I have it stable at 5 GHz all Core speed with 1.35V. With -2 AVX Offset. I would say you could get it with 1.3 to 1.35 all core boost, more wouldn't be necessary.
> Good cooling solution assumed.



1.35 V is also just about as far as you want to go, and in reality you are running a 4.8 GHz OC, not 5 GHz; the AVX offset effectively means that, because AVX is the hardest load scenario.

The chips that do 5 GHz and up (6c/12t or 8c and up) at 1.3 V are quite rare, especially without an offset. It certainly isn't the norm.


----------



## Camper7 (Feb 14, 2020)

And what cooler is required to cool the CPU at 5.3 GHz? Or does it require an effective water cooler?


----------



## JackCarver (Feb 14, 2020)

Vayra86 said:


> 1.35 V is also just about as far as you want to go, and in reality you are running a 4.8 GHz OC, not 5 GHz; the AVX offset effectively means that, because AVX is the hardest load scenario.
> 
> The chips that do 5 GHz and up (6c/12t or 8c and up) at 1.3 V are quite rare, especially without an offset. It certainly isn't the norm.



That's true, but if you only want to reach the all-core boost speed, not beyond, then it should be possible for most chips without an AVX offset. A 4.7 GHz all-core boost without an AVX offset wouldn't be a problem for my i7 8700K.


----------



## Vayra86 (Feb 14, 2020)

ARF said:


> That's not true. Consoles and PCs run on the same basic platform: the x86-64 CPU architecture and APIs also found in the Windows environment.
> 
> With the next-gen consoles, we are going to see an estimated 6x-8x CPU speed increase and between 95-110% higher GPU performance.
> 
> ...



Hang on. First you say the two aren't comparable and to illustrate that, you bring an article that *compares a PC with hardware equivalent to the new consoles*, to the current console crop?

Does not compute.

In reality, what we see is that current consoles are both CPU and GPU limited and that even similar spec PCs with a faster CPU can get a better overall experience out of it. So no, there isn't a big plus to running console games on a console anymore. The only thing is high quality optimization, but that really only happens for a handful of first party titles. The cheaper devs haven't got time for that at all, they dev a console version on the PC and outsource any ports.

That said we don't disagree. Yes, the CPU demands for games will go up a bit. But I think you will find that CPU and its threads in use for many other things besides gaming, just like on a PC. In fact, the console is fast becoming the device that does more multitasking than your typical gamer PC. It has more readily integrated social apps, recording, media applications, etc. On a PC those are services you can control yourself.

What Zen will really do for the next consoles is bring the CPU side back into balance with the GPU. It was much needed. It allows the consoles to better support 60 Hz/120 Hz gaming, and it's no coincidence this arrives alongside 4K 120 Hz-capable OLEDs. 30 FPS is rapidly coming to be viewed as subpar, even in the mainstream, and as panel diagonals increase, low refresh rates become much more visible. Not pleasant to look at.



Camper7 said:


> And what cooler is required to cool the CPU at 5.3 GHz? Or does it require an effective water cooler?



This'll do I think. Just don't use the auto OC setting please. Intel does not support overclocking.


----------



## JackCarver (Feb 14, 2020)

One thing on core count:
The guys at PC Games Hardware tested the new Threadrippers for their gaming capability, and although they are not completely useless for gaming like the old ones, they are not gaming CPUs. That said, you can play games on the new TR, but they land at best mid-field in gaming performance compared with all CPUs. An 8-core/16-thread CPU is definitely sufficient for gaming.



Vayra86 said:


> This'll do I think. Just don't use the auto OC setting please. Intel does not support overclocking.



Nice one


----------



## ToxicTaZ (Feb 14, 2020)

Camper7 said:


> And what cooler is required to cool the CPU at 5.3 GHz? Or does it require an effective water cooler?



Any air or liquid cooler rated for at least 200 W will handle a stock 10700K.

For a manual all-core 5.3 GHz OC you need a good aftermarket open loop rated around 400 W, say Swiftech or EK, etc.


----------



## GlacierNine (Feb 14, 2020)

ToxicTaZ said:


> Any air or liquid cooler rated for at least 200 W will handle a stock 10700K.
> 
> For a manual all-core 5.3 GHz OC you need a good aftermarket open loop rated around 400 W, say Swiftech or EK, etc.


----------



## JackCarver (Feb 14, 2020)

ToxicTaZ said:


> rated 400w



You mean it goes that high??? My i7 8700K draws about 190 W in the OCCT benchmark with AVX512 and a small data set, the worst situation ever and not a real-world one. 400 W is double that amount.


----------



## GlacierNine (Feb 14, 2020)

JackCarver said:


> You mean it goes that high??? My i7 8700K draws about 190 W in the OCCT benchmark with AVX512 and a small data set, the worst situation ever and not a real-world one. 400 W is double that amount.


He's literally just making shit up. Ignore him.


----------



## ToxicTaZ (Feb 14, 2020)

JackCarver said:


> You mean it goes that high??? My i7 8700K draws about 190 W in the OCCT benchmark with AVX512 and a small data set, the worst situation ever and not a real-world one. 400 W is double that amount.



Is your 8700K at 5.3 GHz+ with AVX offset 0 on all cores?

That's what we are talking about here.

My 9900KS at stock needs a standard 200 W air or pre-filled liquid cooler. A 10700K will run just fine with any 200 W cooling solution.

But OC the 9900KS to 5.2 GHz+ at AVX offset 0 and it's over 300 W.

So what's the 10700K's wattage at a heavy 5.3 GHz+ AVX0 OC? I would assume it would be similar?


----------



## JackCarver (Feb 14, 2020)

ToxicTaZ said:


> Is your 8700K @5.3GHz+ AVX0 all cores?



It wouldn't get that high. I need 1.39-1.4 V to get it to a 5 GHz all-core turbo without an AVX offset; no chance of getting above 5 GHz without one. The next big problem is my board: at 200 W and above it fries my VRM MOSFETs   . But AVX2 scenarios with small data sets aren't that common in the real world. If I'm right, Ryzen CPUs don't even support AVX2.


----------



## efikkan (Feb 14, 2020)

JackCarver said:


> If I'm right Ryzen cpus don't even support AVX2.


Zen (1) supports AVX2, but with limited performance, because it fuses together two 128-bit AVX units. Zen 2 has two full 256-bit units and much better AVX2 performance.


----------



## Nkd (Feb 15, 2020)

I don't understand why tech sites don't mention this and keep writing as if Intel is pulling 5.3 GHz all-core within a 125 W TDP. Must be hard to mention that the TDP applies at base clock, and that this thing is probably pulling 300+ W at 5 GHz plus. Must be hard stating the details about Intel, lol.

I think close to 70% of sites will just keep quoting the 125 W TDP and bolstering the specs.


----------



## ToxicTaZ (Feb 15, 2020)

Nkd said:


> I don't understand why tech sites don't mention this and keep writing as if Intel is pulling 5.3 GHz all-core within a 125 W TDP. Must be hard to mention that the TDP applies at base clock, and that this thing is probably pulling 300+ W at 5 GHz plus. Must be hard stating the details about Intel, lol.
> 
> I think close to 70% of sites will just keep quoting the 125 W TDP and bolstering the specs.



You can bolster all you want, but if your 105 W CPU can't outperform 125 W CPUs at everyday tasks and, of course, PC gaming... the joke's on AMD.

The stock 9900KS at 127 W is already faster than the stock 3800X at 105 W.

I suspect the stock 10700K at 125 W lands around stock 9900KS performance for less money.

Whether the 10700K comes in cheaper than the 3800X is the question.

I'm not sure the 10700K will take the top 8-core performance crown away from the 9900KS.


----------



## heflys20 (Feb 15, 2020)

Honestly, I'm not sure why anyone would buy a 9900ks or 3800x for Facebook browsing and basic (non $1000+ graphics card) gaming; but I digress.  I understand how things work.  Anyway, this news doesn't impress me, and I hope Intel prices these things accordingly.


----------



## ToxicTaZ (Feb 15, 2020)

heflys20 said:


> Honestly, I'm not sure why anyone would buy a 9900ks or 3800x for generic web browsing and basic (non $1000+ graphics card) gaming; but I digress.  I understand how things work.




Well, I'm a PC gamer 90% of the time, and the Intel 9900KS @ 5.2 GHz is the fastest gaming CPU at the moment with my RTX 2080 NVLink setup.

I'm a performance guy, while most people want easy, cheap hardware.

I'm a custom builder... EK is my cooling of choice... it most likely cost more than your entire setup... that's what top-end people do. Expensive hobbies and PC gaming, mostly.

We are in two different worlds... like a guy with a Honda Civic talking to a guy who has a Bugatti.


----------



## heflys20 (Feb 15, 2020)

Lol. The top one percent. Gotcha. Like I said, I understand how this works. Most revenue comes from us peasants though; that's why I mention pricing. More like an M8 Gran Coupe vs. a Bugatti, IMHO (had to find the vehicle I was thinking of). Unless we're saying my setup is worth $50-100. Lol.


----------



## R0H1T (Feb 15, 2020)

JackCarver said:


> You mean it goes that high??? My i7 8700K draws about 190 W in the OCCT benchmark with AVX512 and a small data set, the worst situation ever and not a real-world one. 400 W is double that amount.


AVX-512 doesn't work on the 8700K, because the 8700K doesn't support it, so unless you meant AVX2 your observation doesn't make sense. Also, as you add more cores plus cache, with clocks above 5 GHz the power can rise steeply with even a mild OC above stock.


----------



## JackCarver (Feb 15, 2020)

R0H1T said:


> unless you meant AVX2



I meant AVX2, not AVX512, you're right. I don't think it reaches 400 W with a mild OC, but we will see when the first tests are out there.



ToxicTaZ said:


> We are in two different worlds... like a guy with a Honda Civic talking to a guy who has a Bugatti.



So good


----------



## voltage (Feb 15, 2020)

Super XP said:


> This reminds me of the Pentium 4 days where Intel kept pushing higher clock speeds while AMD was innovating on CPU designs where AMD CPUs would beat P4's with up to 1000MHz lower clocks. That's how efficient and well designed the Athlon 64 was and beyond.



And then they went to shiz... it then took AMD decades to make something good again... so what's your point? You only focus on anything good AMD does, but not Intel. Meh... short-sighted.


----------



## ToxicTaZ (Feb 15, 2020)

voltage said:


> And then they went to shiz... it then took AMD decades to make something good again... so what's your point? You only focus on anything good AMD does, but not Intel. Meh... short-sighted.



Then Intel came out with Core 2 and changed the world!

Let's see if Intel Meteor Lake will be Intel's next Core 2.

Intel has been very quietly working on 7 nm in the background (Fab 42) and has already spent over $10 billion on it, at the same time as they're working on the broken 10 nm.

Intel Meteor Lake is an all-new architecture on the new 7 nm, finally on the road map. 2023 looks like the launch year, so AMD will enjoy the next 3 years until then.

For now we have Comet Lake, the 10th generation with a broken, unfinished PCIe 4.0 memory controller, and then Rocket Lake (made by Samsung), the quick fix with a working PCIe 4.0 memory controller, both on the LGA 1200 socket with PCIe 4.0 boards.

Then we have Intel Alder Lake (14nm+++ or 10nm++?) and Intel Meteor Lake, both on the LGA 1700 socket with PCIe 5.0 motherboards (DDR5 & USB4).

I'm stuck on my 9900KS till Meteor Lake so I can do another hand-me-down... People keep talking and complaining about TDPs getting higher. People, get used to it!!! It's the new norm. Dual-channel platforms are over 100 W TDP now from both camps, and AMD Threadrippers at 280 W+ are pushing 300 W on quad-channel boards. Intel is talking about a 500 W GPU on 7 nm, where Nvidia and AMD top out in the 250-300 W range.

I'm in here to see if there are any performance leaks on the Intel 10700K; just curious how it performs against my 9900KS... The 10700K will be faster than the 3800X and at a lower price, being an i7-series part.


----------



## heflys20 (Feb 16, 2020)

ToxicTaZ said:


> The 10700K will be faster than the 3800X and at a lower price, being an i7-series part.



That wouldn't be surprising, considering the 3800X is essentially a 3700X with a higher TDP (105 W vs. 65 W, which means more overclocking headroom) and a higher base clock. The difference between the two is a negligible (IMHO) 2-4% when factoring in price.









AMD Ryzen 7 3800X vs. 3700X: What's the Difference? (www.techspot.com)




Over here, the 9700k is about $100 (at most retailers) more than the 3700x; and that's before you buy a cooler.


----------



## ToxicTaZ (Feb 17, 2020)

heflys20 said:


> That wouldn't be surprising, considering the 3800x is essentially a 3700x with a higher TDP ( 65w vs 105=higher overclocking) and base clock. The difference between the two is a negligible (IMHO) 2-4%., when factoring in price.
> 
> 
> 
> ...



Well, since the 10700K is replacing the 9700K, one would expect around the same price range.

Again, the 8-core 10700K will be faster across the board than the 3800X, AMD's best available 8-core CPU. Intel won't be able to challenge AMD's 3700X pricing; that's a fact, and it's coming from an Intel guy.

The 10700K is made to fight the 3800X at 9700K pricing.

I'm betting the 10700K will trade blows with my 9900KS... still waiting for leaks.

The question now is: will the 10700K steal the "World's Fastest Gaming CPU" title from my 9900KS?


----------



## Chrispy_ (Feb 17, 2020)

How attainable is that 5.3GHz boost?

Intel still has a duration limit on boost before it drops back down to a lower state, right? It's either that, or you disable the limits, the 95 W TDP goes to hell, and your board and cooling need to cope with 250 W to the socket :\

Additionally, there's the problem of attaining that speed on all cores. The 9900KS was an exception, with an advertised all-core boost of 5 GHz (assuming you could handle the >300 W power draw), but am I right in thinking that standard K models still boost to different speeds depending on how many cores are loaded? I mean, the 9900K was realistically a 4.4-4.7 GHz chip. Getting it to 5 GHz for more than an instant required stress-testing software that could commandeer all 16 threads and intentionally keep 15 of them exclusively reserved but idle. In a real-world scenario, the background tasks of a modern OS kept at least 2 or 3 additional cores active, meaning you'd almost never see the advertised 5 GHz even running a single thread on an idle machine.


----------



## londiste (Feb 17, 2020)

Chrispy_ said:


> Intel still has a duration limit on boost before it drops back down to a lower state, right? It's either that, or you disable the limits, the 95 W TDP goes to hell, and your board and cooling need to cope with 250 W to the socket :\


Intel does not have a duration limit on boost. It has a duration limit on the extended power limit.
Intel's boost clock (technically Max Turbo Frequency) is the maximum single-core clock speed. A single core at 5.3 GHz will not exceed 95 W, I would hope.
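To spell that out: the time limit applies to Intel's PL1/PL2/Tau power-budget scheme, not to the boost multipliers themselves. A minimal sketch, where the 229 W and 56 s values are commonly reported board defaults for 125 W parts and are my assumptions, not confirmed 10700K numbers:

```python
# Sketch of Intel's PL1/PL2/tau turbo power budgeting (illustrative values).
# The package may draw up to PL2 for roughly TAU seconds of sustained load,
# after which the active limit falls back to PL1 (the rated TDP). Clocks are
# only reduced as far as needed to stay under whichever limit is active.
PL1 = 125.0   # long-term power limit (W), equals the rated TDP
PL2 = 229.0   # short-term power limit (W), board/BIOS dependent
TAU = 56.0    # seconds of PL2 budget (exponentially weighted in reality)

def power_limit(t_under_load):
    """Active package power limit after t seconds of sustained heavy load."""
    return PL2 if t_under_load < TAU else PL1

print(power_limit(10.0))   # still within the turbo window
print(power_limit(120.0))  # fallen back to PL1 / rated TDP
```

Many enthusiast boards simply set Tau to effectively infinite, which is how "125 W" chips end up drawing 200 W+ indefinitely.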


Chrispy_ said:


> am I right in thinking that standard K models still boost to different speeds depending on how many cores are loaded?


This has been the case effectively since processors got more than 2 cores. All mainstream CPUs do exactly this: boost to different speeds depending on how many cores are loaded, and also on the kind of load.
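The per-active-core behaviour both posters describe is essentially a lookup table baked into firmware. The bins below are hypothetical for a 10700K-like 8-core part (only the headline 5.3 GHz for 1-2 active cores comes from the article; the rest are illustrative):

```python
# Hypothetical per-active-core turbo bins (GHz) for an 8-core part.
# The advertised "5.3 GHz boost" applies only to the 1- and 2-core entries.
TURBO_BINS = {1: 5.3, 2: 5.3, 3: 5.1, 4: 5.1, 5: 4.9, 6: 4.9, 7: 4.7, 8: 4.7}

def max_turbo(active_cores):
    """Highest multiplier-limited clock for a given number of loaded cores."""
    return TURBO_BINS[min(max(active_cores, 1), 8)]

print(max_turbo(1))  # the headline single-core number
print(max_turbo(8))  # what an all-core load gets, before power limits bite
```

Power and thermal limits then cap clocks further below these bins; the table is only the multiplier ceiling.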


----------



## Chrispy_ (Feb 17, 2020)

londiste said:


> Intel does not have a duration limit on boost. It has a duration limit on the extended power limit.
> Intel's boost clock (technically Max Turbo Frequency) is the maximum single-core clock speed. A single core at 5.3 GHz will not exceed 95 W, I would hope.


Ah, okay. Tests show that the 9900K pulls about 65 W from the socket when running single-threaded workloads, with clocks averaging around 4.8 GHz as the work bounces between 1-3 threads.


----------



## londiste (Feb 17, 2020)

Chrispy_ said:


> Tests show that the 9900K pulls about 65 W from the socket when running single-threaded workloads, with clocks averaging around 4.8 GHz as the work bounces between 1-3 threads.


Various tests show the 9900K pulls anywhere from 30 to 40 W under a heavy single-core load (at its 5 GHz boost clock).
I suspect you got 65 W from a review where whole-system consumption was measured.


----------



## Super XP (Feb 19, 2020)

ToxicTaZ said:


> You can bolster all you want, but if your 105 W CPU can't outperform 125 W CPUs at everyday tasks and, of course, PC gaming... the joke's on AMD.
> 
> The stock 9900KS at 127 W is already faster than the stock 3800X at 105 W.
> 
> ...


Can we stop comparing a $1000 9900KS with a $430 3800X? Intel's rated TDPs are always calculated at the base clock, excluding any boost clocks. AMD rates its TDPs more in line with industry standards.
It's quite obvious by now that AMD has far better processors than anything Intel has out to date.
Not to mention the massive number of security vulnerabilities Intel CPUs suffer from.



voltage said:


> And then they went to shiz... it then took AMD decades to make something good again... so what's your point? You only focus on anything good AMD does, but not Intel. Meh... short-sighted.


AMD's been innovating and pushing technology in the CPU space for decades. Intel seems to have a monopoly mentality and a sort of arrogance to them, as they've gotten angry many times in the past when AMD releases competitive CPUs. Why else do you think Intel was charged billions of dollars in damages for anti-competitive, anti-consumer and anti-technology behaviour, all proven in multiple courts of law? Anyhow, I said lots of good things about Intel's Conroe architecture, and a couple of revisions after it. But then we find out that they took design shortcuts that resulted in over 250 security vulnerabilities, with new ones popping up here and there, and most not fixed nor addressed.

Further, with regard to AMD's push to innovate: AMD really has no choice but to innovate, as they can't afford not to. Bulldozer, despite being an innovation, set them back many years. Though they remained somewhat competitive on price/performance, they still fell behind. Zen changed all that. If The Inquirer were still around, you could read a great article called Where AMD Leads, Intel Follows. Because that is what has been happening for at least 25+ years: AMD leads the industry and Intel closely follows behind, at least in the desktop and server space.

Many people in the industry already know this.
Anyhow, I ain't going to debate this, as this is fact-based information.


----------



## ToxicTaZ (Feb 20, 2020)

Super XP said:


> Can we stop comparing a $1000 9900KS with a $430 3800X? Intel's rated TDPs are always calculated at the base clock, excluding any boost clocks. AMD rates its TDPs more in line with industry standards.
> It's quite obvious by now that AMD has far better processors than anything Intel has out to date.
> Not to mention the massive number of security vulnerabilities Intel CPUs suffer from.
> 
> ...



The only real fact here is that the 9900KS is faster than your 3800X... You can quote prices till you're dead!

At the end of the day, Intel makes the better 8-core CPU at the moment!

Boom: both the 9900KS and the 10700K are faster than AMD's "BEST" 8-core CPU!

By the way, all the mitigations are hardware-integrated into Stepping 13.


Funny how people would rather run with the lemmings or sheep than be a prodigy these days!


----------



## Chrispy_ (Feb 20, 2020)

ToxicTaZ said:


> The only real fact here is that the 9900KS is faster than your 3800X... You can quote prices till you're dead!
> 
> At the end of the day, Intel makes the better 8-core CPU at the moment!
> 
> ...


You do realise that you can buy a 3950X and use its 16 cores against a 9900KS, right?

The 3950X trades blows with the 9900KS in single-threaded matches: it's 4.7 GHz/64 MB cache vs. 5.0 GHz/16 MB cache, and the Ryzen 9 responds better to fast memory than the Intel; something anyone buying >$500 chips should be able to afford. Once you need more than a single thread, the AMD runs away with all the victories. More performance, lower power draw, and more PCIe bandwidth. It's a win-win-win, and I'm not even considering all the security vulnerabilities plaguing the faulty Intel platform, either. Intel may be adding mitigations in hardware, but as fast as they patch one problem, five more spring up. Their architecture is so old and flawed that it's an easy target. Their security problems won't go away until they actually make a truly new architecture that isn't yet another patched-up Skylake!

Sure, the 9900KS is a very fast _8-core_ CPU, but if you need multi-threaded performance, "only" 8 cores is embarrassingly weak and if you don't need multi-threaded performance, then the 9900KS is 300+ wasted dollars that could (and should) be spent on better GPU and RAM instead. Let's face it - AMD is up to 64 cores now on consumer platforms and their advantage is growing rapidly whilst Intel seem to be floundering around in a mess born of their own complacency - blaming 10nm complications as the sole scapegoat for their multiple failings over the last half-decade. I'd pity them but they don't deserve any pity because they've bribed and cheated their way to the top and ripped us all off in the process.


----------



## Super XP (Feb 20, 2020)

ToxicTaZ said:


> *The only real fact here is that the 9900KS is faster than your 3800X... You can quote prices till you're dead!*
> 
> At the end of the day, Intel makes the better 8-core CPU at the moment!
> 
> ...


Don't flatter the very minuscule single-threaded performance advantage Intel has. It comes with A LOT more power draw and one of the worst price/performance ratios on the planet, not to mention its efficiency simply stinks.

When AMD launches Zen 3, Intel will lose even that minuscule single-threaded advantage. Then what are they going to do? Come out with a 9980KS with even higher clocks that needs 400 W to run and call it a 125 W TDP?


And FYI, my original post is based on facts...


----------



## ToxicTaZ (Feb 20, 2020)

Chrispy_ said:


> You do realise that you can buy a 3950X and use its 16 cores against a 9900KS, right?
> 
> The 3950X trades blows with the 9900KS in single-threaded matches: it's 4.7 GHz/64 MB cache vs. 5.0 GHz/16 MB cache, and the Ryzen 9 responds better to fast memory than the Intel; something anyone buying >$500 chips should be able to afford. Once you need more than a single thread, the AMD runs away with all the victories. More performance, lower power draw, and more PCIe bandwidth. It's a win-win-win, and I'm not even considering all the security vulnerabilities plaguing the faulty Intel platform, either. Intel may be adding mitigations in hardware, but as fast as they patch one problem, five more spring up. Their architecture is so old and flawed that it's an easy target. Their security problems won't go away until they actually make a truly new architecture that isn't yet another patched-up Skylake!
> 
> Sure, the 9900KS is a very fast _8-core_ CPU, but if you need multi-threaded performance, "only" 8 cores is embarrassingly weak and if you don't need multi-threaded performance, then the 9900KS is 300+ wasted dollars that could (and should) be spent on better GPU and RAM instead. Let's face it - AMD is up to 64 cores now on consumer platforms and their advantage is growing rapidly whilst Intel seem to be floundering around in a mess born of their own complacency - blaming 10nm complications as the sole scapegoat for their multiple failings over the last half-decade. I'd pity them but they don't deserve any pity because they've bribed and cheated their way to the top and ripped us all off in the process.



What kind of drug fantasy are you on? Whether you like it or not, the stock 9900KS still holds the record for fastest gaming CPU! No competition from AMD's best 8-core CPU, the 3800X, or from any R9 for that matter...

LOL, you bring up the 3950X when it is extremely expensive, especially to AMD people who talk all day long about price, price, price. Bringing up the 3990X is even more ridiculous if you want to quote price all day!

Nothing is touching my 9900KS @ 5.2 GHz (a 30% OC), cooled by EK, in gaming, which is what I do 90% of the time on my custom gaming rig.

You say get a better GPU and RAM?? Are you on drugs? I'm running an RTX 2080 NVLink setup with XMP 4133 MHz CL17-17-17-37 ultra-low-latency RAM! Well above any AMD GPU and dual-channel platform in gaming!

Sorry, AMD doesn't have a better 8-core than the 3800X for now. Maybe the AMD 4000 series will have a better 8-core CPU than the 10700K?

The 10700K will outperform the 3800X at the 9700K's price.

I would love to see the 3800X run 5 GHz clock speeds. Just imagine the power and heat-throttling nightmare if AMD had 5 GHz tech, lol... but unfortunately it's bound to a 4.3 GHz max OC if you're very lucky.



Super XP said:


> Don't flatter the very minuscule single-threaded performance advantage Intel has. It comes with A LOT more power draw and one of the worst price/performance ratios on the planet, not to mention its efficiency simply stinks.
> 
> When AMD launches Zen 3, Intel will lose even that minuscule single-threaded advantage. Then what are they going to do? Come out with a 9980KS with even higher clocks that needs 400 W to run and call it a 125 W TDP?
> 
> ...



Intel has Rocket Lake to deal with Zen 3, and yes, Rocket Lake is 125 W TDP; what's your point? TDP means nothing... just like your Threadrippers now pushing almost 300 W TDP, going by your own statement.

Yeah, you're right; I wouldn't be surprised if Intel came out with a 10900KS, lol... that would be amazing.

Intel has had no competition from AMD in the PC gaming department from the 2700K to the 10700K, going on a decade now.


----------



## Super XP (Feb 20, 2020)

Intel has no competition from AMD in PC gaming, you say?
I recommend you stop drinking, OK.


----------



## GlacierNine (Feb 20, 2020)

Two completely blind fanboys screaming at each other. What a fantastic thing the internet has brought to us.


----------



## Super XP (Feb 20, 2020)

GlacierNine said:


> Two completely blind fanboys screaming at each other. What a fantastic thing the internet has brought to us.


You call a constructive conversation fanboy screaming? Interesting...
I learned something from ToxicTaz as I'm sure he learned something from me. Forums are here for discussion.


----------



## ToxicTaZ (Feb 20, 2020)

Super XP said:


> You call a constructive conversation fanboy screaming? Interesting...
> I learned something from ToxicTaz as I'm sure he learned something from me. Forums are here for discussion.




Hear, hear, well said!


----------



## GlacierNine (Feb 20, 2020)

Super XP said:


> You call a constructive conversation fanboy screaming? Interesting...
> I learned something from ToxicTaz as I'm sure he learned something from me. Forums are here for discussion.


I dunno about that. You both seemed absolutely determined not to actually absorb each other's points.

For example, you both tried to claim that your respective manufacturer's TDP was somehow less than the others. You stated: "Intel's rated TDPs are always calculated at the base clock, excluding any boost clocks. AMD rates its TDPs more with industry standards. "

Taz proceeded to then state: "Intel has Rocket Lake to deal with Zen 3 and yes Rocket Lake is 125 W TDP, what's your point? TDP means nothing... just like your Threadrippers now pushing almost 300 W TDP going off your statement."

The problem being that:

1 - There are no industry standards on what TDP means. Gamer's Nexus has a great in depth article here: https://www.gamersnexus.net/guides/...lained-deep-dive-cooler-manufacturer-opinions

2 - Taz's statement first dismisses the importance of TDP entirely, and then in the very same statement he uses TDP as a means by which to attack your point.

You're both at best half right and there's certainly no indication either of you is listening to what the other has to say.


----------



## londiste (Feb 20, 2020)

There is an industry-standard idea of TDP: the maximum amount of heat a component generates. As the name says, its purpose is thermal design - the component's cooling should be able to handle a specific maximum value.


----------



## bug (Feb 20, 2020)

londiste said:


> There is an industry standard idea of TDP - the maximum amount of heat component generates. As the name says the purpose for it is thermal design - component cooling should have a specific maximum value to be able to handle.


Care to link to this supposed standard?
Because TDP is not as simple as you think it is.


----------



## londiste (Feb 20, 2020)

What makes it not simple? There is a piece of something - usually metal, because that has good thermal conductivity and the ability to withstand high temperatures - and that piece radiates heat. The rate of heat transfer is measured in watts (W, one joule of heat per second). TDP is the maximum designed amount of heat to be radiated from it. This is quite useful for figuring out its cooling needs. Up to this point, none of this is specific to CPUs or semiconductors.

In the case of a chip, radiated heat for all practical purposes (technically - roughly) equals its power consumption.

When it comes to CPUs, marketing and brand politics come into play and disrupt the normal spec-creation process. Both AMD and Intel currently have convoluted, complex TDP definitions on purpose (or should I say for marketing purposes), involving thermally significant periods and useful work. Surprisingly, at the same time, GPUs adhere to and limit themselves to TDP quite precisely.


----------



## bug (Feb 20, 2020)

londiste said:


> What makes it not simple? There is a piece of something - usually metal, because that has good thermal conductivity and the ability to withstand high temperatures - and that piece radiates heat. The rate of heat transfer is measured in watts (W, one joule of heat per second). TDP is the maximum designed amount of heat to be radiated from it. This is quite useful for figuring out its cooling needs. Up to this point, none of this is specific to CPUs or semiconductors.
> 
> In the case of a chip, radiated heat for all practical purposes (technically - roughly) equals its power consumption.
> 
> When it comes to CPUs, marketing and brand politics come into play and disrupt the normal spec-creation process. Both AMD and Intel currently have convoluted, complex TDP definitions on purpose (or should I say for marketing purposes), involving thermally significant periods and useful work. Surprisingly, at the same time, GPUs adhere to and limit themselves to TDP quite precisely.


So basically you don't care to link to that standard. Got it.


----------



## londiste (Feb 20, 2020)

bug said:


> So basically you don't care to link to that standard. Got it.


This is a cop-out and you know it.
Did you read what I wrote?


----------



## bug (Feb 20, 2020)

londiste said:


> This is a cop-out and you know it.


What is a cop-out? You claimed there is an industry standard, show it to me.


----------



## londiste (Feb 20, 2020)

Industry standard does not always mean a document. It also means a kind of established common-sense in an industry.
Again, what would you say was wrong in my post?


----------



## bug (Feb 20, 2020)

londiste said:


> Industry standard does not always mean a document. It also means a kind of established common-sense in an industry.



Not only is an industry standard always a document, it's also a vetted one.
Basically, you're wishing your common sense was an industry standard.


londiste said:


> Again, what would you say was wrong in my post?


It's incomplete, it paints a truncated picture.


----------



## londiste (Feb 20, 2020)

What kind of truncated picture? Could you please elaborate?



bug said:


> Not only is an industry standard always a document, it's also a vetted one.
> Basically, you're wishing your common sense was an industry standard.


OK, we seem to have a linguistic argument here. Sorry, English is not my first language. I could have sworn industry standard is used for other meanings than strictly vetted documents.


----------



## bug (Feb 20, 2020)

londiste said:


> What kind of truncated picture? Could you please elaborate?


I can. But I will just go get some sleep instead.



londiste said:


> What kind of truncated picture? Could you please elaborate?


Ok, here goes: you're disregarding change and thinking of a stationary system.

In practice, if you design a system to be able to sustain 100W TDP, the heatsink, if cool enough, will be able to handle, say 200W for a limited amount of time. It will get hot, but while it can still absorb heat, it will do its job. This is what you see with Intel systems when you conclude their TDP definition is wrong. But the thing is, they can't know how far the CPU will go, because you may have bought a 150W capable heatsink for your system. Or even a 200W heatsink. There's a limit in the silicon, too, that's where Intel sets the cutoff, but until you reach there, the CPU is allowed to go crazy. But how crazy, that will vary from system to system, Intel cannot guarantee that.
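A rough way to see the effect bug describes is a first-order thermal model: a heatsink's mass absorbs a burst above its steady-state rating for a while before temperatures run away. All constants below are invented for illustration; this is a sketch of the idea, not any vendor's model.

```python
# First-order (RC) thermal model of a heatsink under load.
# r_theta: thermal resistance (C/W); c_thermal: heat capacity (J/C).
# Both values are made up for illustration.

def simulate(power_w, seconds, r_theta=0.5, c_thermal=200.0,
             ambient=25.0, dt=0.1):
    """Return the temperature after `seconds` of constant `power_w` input."""
    temp = ambient
    for _ in range(int(seconds / dt)):
        heat_in = power_w                          # watts fed in by the chip
        heat_out = (temp - ambient) / r_theta      # dissipation grows with delta-T
        temp += (heat_in - heat_out) * dt / c_thermal
    return temp

# 100 W settles near 25 + 100 * 0.5 = 75 C, which this cooler handles forever.
print(round(simulate(100, 2000), 1))
# 200 W heads toward a 125 C steady state, but a 30-second burst stays far
# below that -- the heatsink is still "absorbing" the excess.
print(round(simulate(200, 30), 1))
```

The burst case is exactly why a short boost above TDP is survivable on a cooler sized for the base number, while a sustained overload is not.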


londiste said:


> OK, we seem to have a linguistic argument here. Sorry, English is not my first language. I could have sworn industry standard is used for other meanings than strictly vetted documents.


It's not my first language either and you're not entirely wrong. There are unratified standards, they are called "de-facto standards" (the others are "de-jure standards"). But when you're talking about de-facto standards, you need to refer to them as such. Also, de-facto standards are seldom binding (e.g. JPEG is the de-facto standard for images on the web, yet there's plenty of other formats used for the same purpose).


----------



## londiste (Feb 21, 2020)

bug said:


> Ok, here goes: you're disregarding change and thinking of a stationary system.
> 
> In practice, if you design a system to be able to sustain 100W TDP, the heatsink, if cool enough, will be able to handle, say 200W for a limited amount of time. It will get hot, but while it can still absorb heat, it will do its job. This is what you see with Intel systems when you conclude their TDP definition is wrong. But the thing is, they can't know how far the CPU will go, because you may have bought a 150W capable heatsink for your system. Or even a 200W heatsink. There's a limit in the silicon, too, that's where Intel sets the cutoff, but until you reach there, the CPU is allowed to go crazy. But how crazy, that will vary from system to system, Intel cannot guarantee that.


The dynamic nature of the load should not be that much of a problem - spec for the maximum.

The thing with CPUs, as with other modern semiconductors, is that TDP is almost never about the silicon's capability - everything is power limited anyway. Assuming (and using) potential additional cooling capacity by default is not something I would say is OK. While that is the official reasoning, it does not sound very sincere. It is the "thermally useful" and "useful load" thing right there. With a heavy load (which even gaming can provide on lower and midrange CPUs these days), the boost received is very minor, if present at all. On the other hand, it helps a lot with benchmarks.

Edit: that last part assumes the over-TDP period is temporary and uses an oversized cooler's ability to absorb heat, which is what the theory says. This does seem to be the case for Intel non-K CPUs today, but not for Intel K CPUs or Ryzen 3000s - those two will happily boost beyond TDP.

Heatsink TDP has proven to be even more bullshit than CPU TDP, even though it should be simpler, since the dynamic part of a heatsink is generally limited to temperature and fan speed.


----------



## bug (Feb 21, 2020)

londiste said:


> Dynamic nature of the load should not be that much of a problem, spec for maximum.
> 
> The thing with CPUs as well as other modern semiconductors is that TDP is almost never about the silicon capability - everything is power limited anyway. Assuming (and using) potential additional cooling capacity by default is something I would not say is OK. While that is the official reasoning it does not sound very sincere. It is the "thermally useful" and "useful load" thing right there. With a heavy load (which on lower and midrange CPUs even gaming can provide these days) the boost received is very minor if at all. On the other hand, it helps a lot with benchmarks.
> 
> ...


Let's make this simple: Intel tells you what the CPU will pull under constant load and under (some) bursty conditions. They could hard-cap power there, but instead they let systems with beefier cooling draw more power, as long as they don't get too hot. This last part is not quantifiable, so they cannot put a number on it. If Intel did what you suggest, like they did for years, there would be untapped potential left in your CPU.


----------



## londiste (Feb 21, 2020)

- Intel tells you what the CPU would pull under constant load and silently gives you another number for bursty conditions (+25% for 8 seconds). In a silent agreement with motherboard manufacturers limits are not always imposed on K-series CPUs (or use something inane like 210W for 9900K) letting them pull more than they are supposed to for longer than they should.
- AMD does seem to have a proper power limit in place (at 35% more than the stated TDP value for Ryzen 3000). Additional headroom is available as PBO.

Yes, both state some reliance on cooler capabilities but in practice insufficient cooler just triggers thermal throttle and calms CPU down with that. AMD does have dynamic boost mechanism based on temperatures but it does not make that much of a difference in practice.
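The two boost budgets quoted above reduce to simple arithmetic. A sketch, using the 1.25x/8-second and 1.35x multipliers from the post (these are the commonly reported defaults, not an official spec):

```python
# Short-term power budgets as reported for Intel and AMD desktop CPUs.
# Multipliers are the widely quoted defaults, used here for illustration.

def intel_pl2(pl1_w, multiplier=1.25):
    """Intel's default short-term limit: PL2 = 1.25 x PL1, held for tau = 8 s."""
    return pl1_w * multiplier

def amd_ppt(tdp_w, multiplier=1.35):
    """Ryzen 3000 package power tracking limit: PPT = 1.35 x TDP."""
    return tdp_w * multiplier

print(intel_pl2(95))   # 118.75 W -- vs. the ~210 W boards actually set for the 9900K
print(amd_ppt(105))    # 141.75 W for a 105 W Ryzen 3000 part
```

The gap between the computed 118.75 W and the 210 W that motherboards actually configure is the "silent agreement" being complained about here.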


----------



## bug (Feb 21, 2020)

londiste said:


> - Intel tells you what the CPU would pull under constant load and silently gives you another number for bursty conditions (+25% for 8 seconds). In a silent agreement with motherboard manufacturers limits are not always imposed on K-series CPUs (or use something inane like 210W for 9900K) letting them pull more than they are supposed to for longer than they should.
> - AMD does seem to have a proper power limit in place (at 35% more than the stated TDP value for Ryzen 3000). Additional headroom is available as PBO.
> 
> Yes, both state some reliance on cooler capabilities but in practice insufficient cooler just triggers thermal throttle and calms CPU down with that. AMD does have dynamic boost mechanism based on temperatures but it does not make that much of a difference in practice.


Yeah, well, when the architectures are so different, power draw will act differently. What can you do?


----------



## Chrispy_ (Feb 21, 2020)

ToxicTaZ said:


> You say get a better GPU and RAM?? Are you on drugs?


Well, you are GPU-limited in most games. You bought a $600 CPU and a $700 RTX 2080, for $1300.
You would have had better gaming performance from a $300 CPU and a $1000 RTX 2080 Ti for the same $1300.

It really is that simple.

Whether it's a 9700K or a 3800X doesn't really matter - a 2080 Ti is so much faster than a vanilla 2080 that there's GPU performance you denied yourself by spending that money on an overpriced CPU instead.


----------



## ToxicTaZ (Feb 21, 2020)

londiste said:


> - Intel tells you what the CPU would pull under constant load and silently gives you another number for bursty conditions (+25% for 8 seconds). In a silent agreement with motherboard manufacturers limits are not always imposed on K-series CPUs (or use something inane like 210W for 9900K) letting them pull more than they are supposed to for longer than they should.
> - AMD does seem to have a proper power limit in place (at 35% more than the stated TDP value for Ryzen 3000). Additional headroom is available as PBO.
> 
> Yes, both state some reliance on cooler capabilities but in practice insufficient cooler just triggers thermal throttle and calms CPU down with that. AMD does have dynamic boost mechanism based on temperatures but it does not make that much of a difference in practice.



A stock 9900KS is less than 200 W.

Any 200 W air or liquid cooler works fine.

The 10700K uses less power than my 9900KS with similar performance, just above the 3800X.



Chrispy_ said:


> Well you are GPU-limited in most games. You bought a $600 CPU and $700 RTX2080.
> You would have better gaming performance on a $300 CPU and a $1000 RTX 2080Ti
> 
> It really is that simple. Whether it's a 9700K or a 3800X doesn't really matter. You don't have a 2080Ti so there's performance left on the table that you chose to plough into your CPU that would have served you better on the GPU instead.



Did you read what I said? "RTX 2080 NVLink" - for people who don't know, that's "2080 SLI" - and yes, all my SLI profiles are working. SLI takes a lot of work to work... But performance-wise, 2080 SLI is faster than a 2080 Ti.

3840x1600 is what I'm driving (LG UltraGear 38GL950G-B), if you're concerned.

On Black Friday 2018, my RTX 2080 NVLink setup was $200 less than one RTX 2080 Ti.

Playing COD Modern Warfare at the moment, maxed out, 120+ fps with ray tracing off... with ray tracing on it's around 85+ fps.


----------



## GlacierNine (Feb 21, 2020)

londiste said:


> - Intel tells you what the CPU would pull under constant load and silently gives you another number for bursty conditions (+25% for 8 seconds). In a silent agreement with motherboard manufacturers limits are not always imposed on K-series CPUs (or use something inane like 210W for 9900K) letting them pull more than they are supposed to for longer than they should.
> - AMD does seem to have a proper power limit in place (at 35% more than the stated TDP value for Ryzen 3000). Additional headroom is available as PBO.
> 
> Yes, both state some reliance on cooler capabilities but in practice insufficient cooler just triggers thermal throttle and calms CPU down with that. AMD does have dynamic boost mechanism based on temperatures but it does not make that much of a difference in practice.


Not how TDP works.
Let's pretend you have a heatsink large enough to keep a heat-emitting object at 50 C in a 20 C room, and let's say that you're dealing with 100 W in those circumstances.

The exact same heatsink, with absolutely no changes, will also be able to keep a more powerful heat-emitting object at a steady temperature somewhere upwards of 50 C (it's the delta over ambient that roughly doubles, not the absolute temperature) in a 20 C room, and it will be dissipating, say, 200 W of heat the entire time it does so.

So is this imaginary heatsink a 100 W TDP heatsink or a 200 W TDP heatsink? It can dissipate either figure as long as you're prepared to accept a higher delta over ambient, but calling either one the rated TDP of the cooler is meaningless unless you also specify a temperature delta over ambient.

No manufacturer provides this info.
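The point reduces to one number: a cooler is characterized by a thermal resistance (degrees of rise per watt), and its "TDP rating" depends entirely on the delta-T you accept. A minimal sketch, with 0.25 C/W as an invented figure:

```python
# A cooler is described by thermal resistance (C per W), not a single wattage.
# The 0.25 C/W value is hypothetical, for illustration only.

R_THETA = 0.25  # C/W, cooler thermal resistance (hypothetical)

def rated_tdp(accepted_delta_c, r_theta=R_THETA):
    """Wattage the cooler sustains for a given temperature rise over ambient."""
    return accepted_delta_c / r_theta

print(rated_tdp(25))  # 100.0 W if you accept a 25 C rise over ambient
print(rated_tdp(50))  # 200.0 W if you accept a 50 C rise -- same cooler
```

So the same physical cooler honestly "is" both a 100 W and a 200 W part, which is why a wattage rating without a stated delta-T is not comparable between manufacturers.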


----------



## Chrispy_ (Feb 21, 2020)

ToxicTaZ said:


> Did you read what I said? “RTX 2080 NVlink" to unknowledge people that's "2080 SLI" and yes all my SLI profiles are working....SLI takes alot of work to work....... But performance wise 2080 SLI is faster than 2080Ti


Ewww! SLI...
I'll quote TechPowerup, since you're using these forums:

"So the burning question: should you spend $1,680 on RTX 2080 SLI? *Absolutely not*. Averaging all our tests, RTX 2080 SLI is within single-digit percentage performance of the RTX 2080 Ti"

Since that article was written 18 months ago, even fewer games have decent SLI support - Nvidia have dropped SLI for nearly all of their cards so most developers aren't bothering to optimise for it.

I mean, if you're happy - great, but I certainly wouldn't go around preaching its merits.


----------



## londiste (Feb 21, 2020)

bug said:


> Yeah, well, when the architectures are so different, power draw will act differently. What can you do?


Either set the TDP to where the power limit is, or limit the power to where the TDP is set. Different architectures do not really play a large part in this.


ToxicTaZ said:


> Stock 9900KS is less than 200w @stk


At stock, yes it is. It tends to have a measured power consumption of 160-170W. 
Anandtech noted in their 9900K review that PL2 for it is set to 210W and the same has been found by other reviewers with different motherboards. Note that this is PL2 - power limit for the temporary boost. Based on Intel's own documentation, this should last 8 seconds at maximum and be 125% TDP. In theory.


----------



## bug (Feb 22, 2020)

GlacierNine said:


> Not how TDP works.
> Let's pretend you have a heatsink large enough to keep a heat-emitting object at 50C in a 20C room, and lets say that you're dealing with 100W in those circumstances.
> 
> The exact same heatsink with absolutely no changes, will also be able to keep a more substantial heat emitting object, at a steady temperature somewhere upwards of 50C (Since temperature scales are arbitrary, it won't be just twice as many C) in a 20C room, and it will be dissipating, say, 200W of heat the entire time it does so.
> ...



Not really. If a heatsink is designed to handle 100 W, that's how much it can dissipate continuously. If you feed it 200 W constantly, it will start heating up faster than it can shed heat and temps will spiral up from there. Luckily, CPUs are designed to back off from dissipating 200 W when they detect this.
So no, the CPU will not get just slightly warmer if it goes beyond what the heatsink is designed to handle.


londiste said:


> Either *set the TDP to the where the power limit is* or limit the power where TDP is set to. Different architectures do not really play a large part in this.


Man, have I been talking to myself all this time? The power limit is dictated by the CPU, the heatsink and the airflow in your case. How would you put a number on that as the manufacturer of the CPU?


----------



## Super XP (Feb 22, 2020)

A great rule of thumb would be to use the maximum power the CPU draws when its clocks are boosted. That is how they should determine TDP. I know from past and present that AMD's TDP figures are more accurate than Intel's.


----------



## bug (Feb 22, 2020)

Ok, another one that's decided to be immune to reason. I give up.

For reference: no matter how you define TDP, the heatsink will allow the CPU to dissipate more than that _for a short period of time_. If you move the TDP up to include the bursty TDP, you'll need a bigger heatsink. A bigger heatsink will, in turn, allow the CPU to boost even higher for a time. Rinse and repeat.


----------



## ToxicTaZ (Feb 22, 2020)

Super XP said:


> A great rule of thumb would be to use the max power limit the CPU gets when its clocks are boosted. That is how they should determine the TDP. I know from the past and present that AMD's TDP are more accurate over Intel's.



Isn't TDP based upon base clocks? Thus we are always trying to find out what the Turbo-clock TDP is...

200 W air or liquid coolers are standard nowadays.

I'm running a custom push-pull open loop with a 420 mm rad and D5 pumps, cooled by EK... that could handle 400 W of CPU cooling, no problem.

Some guys have PC chillers that can handle 800 W... for people who want to run sub-zero temperatures...


----------



## londiste (Feb 22, 2020)

bug said:


> Man, have I been talking to myself all this time? The power limit is dictated by the CPU, the heatsink and the airflow in your case. How would you put a number on that as the manufacturer of the CPU?


Power limit is set in BIOS.
As a CPU manufacturer why would you care about heatsink and airflow? Manufacturer says this CPU emits x Watts of heat. The rest of it is up to OEMs and system builders.


bug said:


> For reference: no matter how you define TDP, the heatsink will allow the CPU to dissipate more than that _for a short period of time_.


That short period of time is 1. misleading and 2. not what we see from Intel and AMD today in many cases.


bug said:


> If you move the TDP to include the bursty TDP, you'll need a bigger heatsink. A bigger heatisnk will, in turn allow the CPU to boost even higher for a time. Rinse and repeat.


How would you spec something like this?


----------



## Super XP (Feb 24, 2020)

ToxicTaZ said:


> Isn't the TDP based upon base clocks is it not? Thus we are always trying to find out what Turbo clocks TDP is...


It shouldn't be based on base clocks, because when you go into Boost or Turbo mode, more voltage is used to reach that boosted speed.
You need a cooler that can handle the very max boost clock, or you run into overheating issues. You always design for the maximum.

Say you buy a CPU with a rated TDP of 90 W, but in boost it draws 200 W. If you followed that TDP rating for your CPU cooler purchase, you would overheat your processor.


----------



## John Naylor (Feb 29, 2020)

What is the fascination with "alleged design superiority" if it doesn't deliver the goods? "Our racing engine is a superior design and our transmission is highly innovative"... so what? If you came in 2nd, none of that matters. It doesn't matter how close it was; the guy who crosses the finish line 1st takes home the big cash prize.

Who can name the silver medalist from the last Olympics? It's the Gold medal winner whose name gets remembered, the Gold medal winner who gets their picture on the Wheaties box. I don't care if Option B is thought to have a better design... I have no reason to care if it's faster at doing things that neither I nor 98% of other PC users do. I cannot justify a $500 3900X for a pure gaming box when TPU test results show a $230 9600K outperforming it.

Forget cores, forget die size. No matter who makes the CPU, no matter how many cores, how many threads, how many teraflops, whatever... arguing about design philosophy is not reasoning, it's making excuses. Only one thing matters: what lets you work and play faster.

Is cost a factor? Yes, of course. Is CPU cost in and of itself a factor? No. You are not comparing the cost of a $250 CPU and a $300 CPU; that's a red herring. Your investment in "the box" is what's relevant. If you are willing to pay $1,800 for a box that's faster instead of $1,750, that investment has no ROI unless the box is more than 2.8% faster. But if you earn a living on that box, say as a CAD operator charging $100 an hour... then you pay for that investment in about 2 days... every day after that you are making money.

When a client asks why your competition completed the project for less money and in less time than you did, explaining that your CPU has a better design and can do many things faster, but is slower in CAD, is not going to cut it. Use the best tool for each job. In my field, CAD operators doing straight 2D and 3D CAD use Intel boxes with GTX/RTX graphics... when that finished work is rendered, it's done on AMD CPUs with more cores and Quadro graphics. There is no tool in any field that is best for every task... the best tool is the one that finishes the job quickest. How new or innovative the design is may pay dividends for the manufacturer, but it doesn't mean beans to the user. What is the best tool in a toolbox? The screwdriver, the hammer, or the wrench? The answer depends on whether your current need is to put in a screw, bang in a nail, or tighten a bolt. If you never bang in nails, then no matter how good the hammer's design is, that tool is not the best choice for your work.


----------



## Super XP (Feb 29, 2020)

John Naylor said:


> I don't care if Option B is thought to have a better design ... I don't have reason to care if it's faster at doing things neither myself nor 98%  other PC users do.  I can not justify a $500 3900x for a pure gaming box when TPU test results show a $230 9600k outperforming it.
> *WHAT? LOL* How new or innivative the design is may be dividends for the manufacturer... but it doesn't mean beans to the user.


First of all, the majority of gaming benchmarks online show the 3900X beating the 9600K. But why would anybody compare a monster 3900X with a 9600K? That Zen 2 CPU is far ahead of it in everything you throw at it. A better comparison, at a similar price, would be the 3600X (6 cores/12 threads), which competes well against the 9600K (6 cores/6 threads) in PC gaming and offers users an upgrade path.
And the argument that Intel is still better at gaming is an age-old argument that no longer holds water, unless people consider a couple-FPS swing a reason to justify buying Intel CPUs that are going EOL.




> Innovation, CPU design technological advancements, efficient manufacturing process ALL play a VITAL Role on a user choosing a CPU.
> FYI, without Innovation, you end up with garbage IPC increases per generation for overpriced CPUs. Look what happened to the PC Gaming industry after AMD released Bulldozer in 2011. 6 years later AMD releases innovative ZEN and revitalized the PC Gaming industry. And continues to do so with ZEN+, ZEN2 and soon to come ZEN3. All while Intel stagnated it as much as possible and selling CPUs for a very high undeserving premium.











- Intel Core i9 9900KS seems to be going EOL (www.guru3d.com): "We reviewed it back in November, the Intel Core i9 9900KS, Intel's premium processor running an all-core 5 GHz, seems to be going EOL as availability of the processor is dropping massively."
- Intel's benchmarking antics questioned (semiaccurate.com): "Intel is cheating at benchmarks, we would say again, but we have not seen any evidence they ever stopped."
- Intel pays for benchmarks, DECEIVES to make AMD look bad (www.tweaktown.com): "Intel commissions Principled Technologies to benchmark Core i9 and AMD CPUs, has to LIE to look good."
- Intel Caught Cheating, Gets a Slap on the Wrist 14 Years Later (wccftech.com): "If you bought a Pentium 4 CPU between 2000 and 2001, Intel owes you 15 bucks for fabricating benchmark results against AMD's Thunderbird CPU."
- AMD Calls Out Intel, BAPco For Benchmark Cheating (www.slashgear.com)
- Intel Performance Strategy Team Publishing Intentionally Misleading Benchmarks (www.servethehome.com): "We checked the Intel performance strategy team's latest benchmarks and found that they used an intentionally misleading test to back their claims."
Performance Strategy Team?
That spells deep desperation for them.


----------



## ToxicTaZ (Feb 29, 2020)

Super XP said:


> First of all the majority of gaming benchmarks online show the 3900x beating the 9600k. But why would anybody compare a monster 3900x with a 9600k? That ZEN2 CPU is way far ahead of it in everything you throw at it. A better comparison and similarly priced CPU would the the 3600x 6Core/12Thread that competes well against the 9600k 6Core/6Thread in PC Gaming and offers users an upgrade pathway.
> And the argument that Intel is still better at gaming is a age old argument that no longer holds water. Unless people consider a couple FPS swings as a reason to justify buying Intel CPUs that are going EOL.
> 
> 
> ...



Not sure why anyone would quote an i5 or R5 for any justification! (Most AMD people are cheap and most can't afford the 3900X/3950X anyway... like talking about the $4,000 3990X monster as if it were everyone's everyday CPU lol.)

Intel still has the fastest 6-core CPU ever made!! (i7-8086K)... It hasn't been beaten yet, but maybe the i5-10600K will do it... we will see... if you dare to get into that stupid argument.

Same with 8-core CPUs!... Both the 9900KS and 10700K are faster than AMD's best, the 3800X...

These are facts!... For now...

Now the performance gap between the 9900KS and 10700K is the more important question at the moment (until the AMD 4000 series).


----------



## Super XP (Mar 1, 2020)

Most AMD people are cheap and some can't afford it? Lol, that's your opinion, and not based on facts.


----------



## ToxicTaZ (Mar 1, 2020)

Super XP said:


> Most AMD people are cheap and some can't afford? Lol that's your opinion, and not based on facts.



Are you implying that most AMD guys are buying the top-end R9 3900X/3950X over the 3800X, 3700X, and 3600X?

What sells more, the 3900X or the 3700X?

That is the statement I made...

So what, again, are AMD guys buying?

It's the same with Intel guys... more people are going to buy the i7-10700K than the top i9-10900K.

In fact, the i5-10600K will outsell both.


----------



## Super XP (Mar 1, 2020)

ToxicTaZ said:


> Are you applying that most AMD guy are all buying Top end R9 3900X/3950X vs 3800X, 3700X, 3600X?
> 
> What sells more? 3900X or 3700X
> 
> ...


You can't just assume that AMD people are cheap. That's simply not the case. 

View AMD people as Smart Buyers. Looking for top performance for fair value. That's a lot better than being ripped off by Nvidia and Intel. Wouldn't you say?


----------



## ToxicTaZ (Mar 1, 2020)

Super XP said:


> You can't just assume that AMD people are cheap. That's simply not the case.
> 
> View AMD people as Smart Buyers. Looking for top performance for fair value. That's a lot better than being ripped off by Nvidia and Intel. Wouldn't you say?



Well, when the first thing that comes out of your mouth is price, price, price... it's really hard not to say the word cheap.

90% of CPU buyers are not buying a 3950X or 9900KS, end of story.

i5 and R5 sell the most!
i9 and R9 sell the least.

i5 and R5 are Intel's and AMD's money makers!!!

So you can say whatever, but this is reality.


----------



## Super XP (Mar 1, 2020)

ToxicTaZ said:


> Well when the first thing that comes out of your mouth is price price price...it's really really hard not saying the word cheap.
> 
> 90% CPU base buyers are not buying 3950X or 9900KS end of story.
> 
> ...


You shouldn't confuse Cheap Buyers with Smart Buyers. Both AMD & Intel owners include those that want value no matter what and those that want the best price/performance processors on the planet. There's absolutely nothing wrong with looking for value & performance. Why on earth would people want to OVERPAY (Intel CPUs) for less when you can get lower prices & higher performance (AMD CPUs)? Both AMD & Intel owners are Smart Buyers. Remember that.

*In January 2020 Amazon's TOP Selling Processors*, the 3950X was 3rd hottest selling CPU and 3900X was 8th hottest selling CPU.


> The top 10 processors on Amazon at the time of publishing are as follows:
> 
> 
> *AMD Ryzen 5 2600*
> ...




Today, on Amazon's top-selling CPUs list, the 3900X is #4 and the 3950X is #13.
On Newegg's top-selling CPUs, the Ryzen 9 3950X is #2 and the Ryzen Threadripper 3970X is #3.








Top Sellers in Processors - Desktops - Newegg.com
(www.newegg.com)
				



So far AMD has held the top spot in Amazon's best-selling CPUs for over a year now, which explains why AMD rules the desktop processor market with up to 85% market share.


----------



## ToxicTaZ (Mar 1, 2020)

Super XP said:


> You shouldn't confuse Cheap Buyers with Smart Buyers. Both AMD & Intel owners have a mix of those that want Value no matter what and those that want the best Price/Performance Processors on the planet. There's absolutely nothing wrong with looking for value & performance. Why on earth would people want to OVERPAY (Intel CPUs) for less when you can get Lower Priced & Higher Performance (AMD CPUs)?   Both AMD & Intel owners are Smart Buyers. Remember that,
> 
> *In January 2020 Amazon's TOP Selling Processors*, the 3950X was 3rd hottest selling CPU and 3900X was 8th hottest selling CPU.
> 
> ...



That's horrible, lol.

After all your wasted time and broken links, Intel still controls 82% of the planet's x86 CPUs... which leaves AMD at 18%. Definitely better than last year, though. 

No point in paying for AMD when the 3800X gets whooped by both the 9900KS and the 10700K, and the 10700K costs less than the 3800X does. 

The 10700K is the 8-core value pick against the 3800X. 

So the smart guy would buy the 10700K, because it's cheaper and outperforms the 3800X. 

Better get your money ready for your new 10700K CPU, going by your logic.


----------



## dirtyferret (Mar 1, 2020)

AnarchoPrimitiv said:


> I'm glad someone pointed this out!  The fact is that Intel is NOT the fastest at gaming IN GENERAL, they're ONLY the fastest at gaming in the specific instance of low resolution (1080p or lower), high refresh, and with a 2080 ti (or whatever the top consumer card may be at the moment).


So Intel is "ONLY" faster in scenarios where as much of the GPU bottleneck as possible is removed. And "IN GENERAL," as long as you don't want high refresh rates and never upgrade your GPU past a 2080 Ti, you are better off with AMD? Got it!

P.S. your caps lock key may be broken


----------



## Super XP (Mar 1, 2020)

ToxicTaZ said:


> That's horrible lol
> 
> After all your wasted time broken links Intel still controlling 82% of planet earth x86 CPUs....that leaves AMD @18%....is definitely better than last year.
> 
> ...


Why would I buy an obsolete 10th Gen Intel CPU that runs hot and has security vulnerabilities, lol, when the 3800X blows it away in multithreading?


----------



## EarthDog (Mar 1, 2020)

Super XP said:


> Why would I buy an obsolete 10th Gen Intel CPU that runs hot and has security vulnerabilities lol and the 3800X blows it away in multithreading.


There are reasons not to buy them... price compared to the same core count, to name one...

However, as a home user, these security issues don't mean much at all. Have you read about them, and do you know how they work (respectfully, it doesn't sound like it)?

You don't think AMD CPUs are generally temperature-limited too? Are 10th-gen chips even out and benchmarked? I mean, core for core, thread for thread, they aren't much faster. Then again, you can actually overclock the Intel CPU... to your cooling limit, of course, but 5 GHz is typically in the cards with high-end air or a 2x120mm+ AIO. 4.3 GHz is about all any Zen 2 CPU reaches...



Super XP said:


> Which explains why AMD rules the processor desktop market by up to 85% market share.


Can you provide a link that shows AMD with 85% market share, please...



Super XP said:


> You can't just assume that AMD people are cheap. That's simply not the case


We can't? That was their whole 'thing' prior to Zen 2 (not Zen, or Zen+). This is the first time AMD has had the (multithreaded) performance crown in well over a decade. AMD was 'built' on being cheap... so yes, I'd agree many are looking for cheap. The difference today is they have a great product that competes, and in some cases wins, against the same-core-count processor, versus the trash they put out prior to Zen.


----------



## Super XP (Mar 1, 2020)

EarthDog said:


> There are reasons not to buy them....price compared to same core count to name one reason...
> 
> However, as a home user, these security issues dont mean much at all. Have you read about them and know how they work (respectfully, it doesnt sound like it)?
> 
> ...


I agree to a certain extent. It took AMD years to finally come out with something very competitive. But all I keep reading is massive negativity from those who refuse to accept Zen as a viable CPU option. Technologically, Zen 2 is well ahead of Intel in architecture, manufacturing process, price and performance. Anyhow, I've debated this point to kingdom come. Lol, just give AMD credit for coming out with Zen and disrupting the entire CPU market, the same market that had stagnated since 2011. 

Here. I'm basing that on a major German retailer:










AMD dominates Intel with 82% market share with major German retailer
Germany's largest retailer sees AMD domination with 82% market share, Intel losing big time.
(www.tweaktown.com)


----------



## EarthDog (Mar 1, 2020)

Super XP said:


> Here. Basing that on a major German retailer.
> 
> 
> 
> ...


So... one random retail store chain in Germany suddenly means AMD has 85% market share??? Do you know what market share means (again, respectfully, your answer implies you do not)?



----------



## Super XP (Mar 1, 2020)

EarthDog said:


> So...one random retail store chain in Germany suddenly means amd has 85% market share??? Do you know what market share means (again, respectfully, your answer implies you do not).


Truthfully, I remembered this from last year but forgot the actual details, so I should have phrased my comment better.
It's one of the largest retailers in Germany. That alone suggests a high probability it's happening at other retailers too, including Amazon, Newegg, etc.


----------



## EarthDog (Mar 1, 2020)

Super XP said:


> Truthfully I remembered this from last year but forgot the actual details. So I should have refrased my comment better.
> It's one of the largest retailers in Germany. That alone shows a high probability it's happening in other retailers, including Amazon, Newegg etc.,


It does not. Correlation is not causation.

Point is, they are gaining market share... nobody doubts that. However, saying they have "up to 85%" is complete FUD (bullshit). One retailer in one country and its sales over a year is NOT market share! Use your head.


----------



## Super XP (Mar 1, 2020)

EarthDog said:


> It does not. Correlation is not causation.
> 
> Point is, they are gaining market share...nobody doubts that. However saying they have "up to 85%" is complete FUD (bullshit). One retailer in one country and their sales over a year is NOT marketshare! Use your head.


It's a good thing I did use my head. 








AMD Ryzen Overtakes Intel Core CPUs Market Share In Major Asian Pacific Markets, Highest CPU Share in 5 Years
AMD has gained massive market share with their Ryzen processors versus Intel's Core lineup in major Asian markets such as Korea and Japan.
(wccftech.com)


----------



## EarthDog (Mar 1, 2020)

Super XP said:


> It's a good thing I did use my head.
> 
> 
> 
> ...


Again, current # of sales /= market share.

Here is what Market Share means: 





> The *market share* is calculated by dividing the volume of goods sold by a particular firm by the total number of units in the *market*.



AMD is gaining market share thanks to how well Ryzen has been selling, no doubt; however, they are nowhere close to owning the overall market.
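The quoted definition boils down to one line of arithmetic. As an illustrative sketch (the unit counts below are hypothetical, not real sales data):

```python
def market_share(firm_units: int, total_units: int) -> float:
    """Market share = units sold by a firm / total units sold in the market,
    expressed as a percentage."""
    return 100.0 * firm_units / total_units

# Hypothetical unit-sales figures, purely to illustrate the formula.
amd_units = 18_000
intel_units = 82_000
total = amd_units + intel_units

print(f"AMD:   {market_share(amd_units, total):.1f}%")    # 18.0%
print(f"Intel: {market_share(intel_units, total):.1f}%")  # 82.0%
```

The point of the definition is the denominator: one retailer's sales feed only the numerator, so a single store's 85% Ryzen split says nothing about the total market.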


----------



## ToxicTaZ (Mar 1, 2020)

Super XP said:


> Why would I buy an obsolete 10th Gen Intel CPU that runs hot and has security vulnerabilities lol and the 3800X blows it away in multithreading.



The 10700K uses less power and runs cooler than the 9900KS, and "both" the 9900KS and the 10700K don't have the vulnerabilities! That was all fixed with stepping 13, if you did your research. And the 10700K is above the 3800X in performance in everything, and at a lower price point! (Best 8 cores vs best 8 cores; we are not talking about the expensive multi-core 3900X/3950X here...) 

Why would you pay more for less performance by buying a 3800X... No point... If you want the best 8-core CPU, one that's cheaper than your 3800X, then the 10700K is the smart value buy!


----------



## jabbadap (Mar 1, 2020)

Hmm, I have always thought the 3800X is a pointless CPU anyway. Buy the 3700X for cheaper and OC it. Stock TDPs are meant to be broken.

Haven't really followed the CPU market lately (CPUs are quite dull). Will that 10th series be pin-compatible with the upcoming 10nm processors too (LGA 1200, or whatever it was called), or is there even a 10nm CPU on the radar for desktop?


----------



## ToxicTaZ (Mar 1, 2020)

jabbadap said:


> Hmm, I have always though 3800x is that pointless cpu anyway. Buy 3700x for cheaper and OC that. Stock TDPs are meant to be broken.
> 
> Haven't really followed cpu market lately(Cpus are quite dull). Will that 10th series be pin compatible with upcoming 10nm processors too(LGA1200 or what was it again), or is there even 10nm cpu on the radar for desktop.



10nm+ is for mobile devices only! 

Intel Tiger Lake 10nm++ mobile CPUs should be in devices like the Microsoft Surface Pro 8 around Halloween. 

Desktop will remain on 14nm++ until Intel's new Fab 42 factory is up and running, fully 7nm/5nm ready from the start.

10th generation is on the new LGA 1200 socket with PCIe 4.0-capable boards, but PCIe 4.0 won't be working until 11th gen (Rocket Lake), around this Christmas. So you could say 10th gen is short-lived. And no, 10th generation is not backwards compatible with Intel 300-series boards... the last, best CPU for the Intel 300 series is the i9-9900KS.

Intel will remain on 14nm++ with the LGA 1200 socket; its PCIe 4.0 motherboards will carry Comet Lake, Rocket Lake, and Alder Lake CPUs!

Intel Meteor Lake is Intel's first 7nm CPU product line, from the new Fab 42 factory.

Intel 7nm/5nm will be on the LGA 1700 socket with PCIe 5.0 motherboards, DDR5, USB4, Wi-Fi 6E, etc... At the same time, AMD's Zen 5 arrives with the new AM5 socket, ending backwards compatibility.

For now, if you want an 8-core CPU with higher performance than the 3800X at a lower, 9700K-level price point, get the 10700K... It comes with a 5.3 GHz Turbo, lol.


----------



## Super XP (Mar 1, 2020)

ToxicTaZ said:


> The 10700K using less power and runs cooler than the 9900KS and "both" 9900KS and 10700K don't have vulnerabilities! That was all fixed with stepping 13 if you did your research. And the 10700K is above the 3800X in performance everything and with a lower price point! (Best 8 cores vs best 8 cores) "we are not talking expensive multicore 3900X/3950X here...
> 
> Why would you pay more for less performance buying a 3800X... No point.... If you want the best 8 cores CPU that's cheaper than your 3800X then your 10700K is best smart value!


The Core i9-10900K hits up to 300 W under load, and it's still based on the 14nm+++ process, which explains the wattage increase. Just going by that. 
The 10700K is an unreleased product at the moment, and it should be faster than the Ryzen 7 3800X (July 7, 2019), as it's a much newer processor. 
And thanks for letting me know about the 10th gen stepping 13 fixing the security vulnerabilities; I didn't know that.


----------



## HenrySomeone (Mar 3, 2020)

ToxicTaZ said:


> The 10700K using less power and runs cooler than the 9900KS and "both" 9900KS and 10700K don't have vulnerabilities! That was all fixed with stepping 13 if you did your research. And the 10700K is above the 3800X in performance everything and with a lower price point! (Best 8 cores vs best 8 cores) "we are not talking expensive multicore 3900X/3950X here...
> 
> Why would you pay more for less performance buying a 3800X... No point.... If you want the best 8 cores CPU that's cheaper than your 3800X then your 10700K is best smart value!


Precisely - AMD got a bit of a head start as far as price/performance 8/16 CPUs go, but that time is ending, and as far as pure performance goes, there was never any doubt about who the (desktop) king is. (They do have a relatively good product in the 3960X & 3970X for those who really need that many cores; the 3990X, on the other hand, is once again just a gimmick, like the 2990WX.)


----------



## ToxicTaZ (Mar 3, 2020)

HenrySomeone said:


> Precisely - AMD got a bit of a headstart as far as price/performance 8/16 cpus go, but that time is ending and as far as pure performance goes, there was never any doubt, who's the (desktop) king (they do have a relatively good product with their 3960 & 3970x though for those who really need that many cores (3990x on the other hand is once again just a gimmick, just like the 2990x))



This is a dual-channel CPU debate; why would you start talking about expensive quad-channel Threadrippers? I think you're very confused about product lines. Or did you mean the expensive dual-channel 3900X/3950X? 

When we are talking about dual-channel 8-core vs 8-core CPUs, AMD basically only has the 3800X, 3700X, 2700X 50th Anniversary Edition, and 2700X, vs Intel's 10700K, 9900KS, 9900K, and 9700K. 

The 10700K and 3800X, as you would say, have similar performance, with the 10700K slightly above AMD's best dual-channel 8-core CPU, the 3800X, and the 10700K will be cheaper than the 3800X as well. 

If you want more performance for less money, get the 10700K; if you don't mind paying more for less performance and saving a little on your power bill, get the 3800X. 

The 10700K looks to be the true 8-core value here. 

Until the AMD 4000 series, of course!


----------



## HenrySomeone (Mar 3, 2020)

Agreed, that's why I said Intel is still the undisputed desktop king; AMD has some useful options in the HEDT, but they still lack the single core punch and low latency to rival team blue where it matters most. And 4000 series will only narrow the gap, it's not gonna fully close it.


----------



## Super XP (Mar 3, 2020)

HenrySomeone said:


> Precisely - AMD got a bit of a headstart as far as price/performance 8/16 cpus go, but that time is ending and as far as pure performance goes, there was never any doubt, who's the (desktop) king (they do have a relatively good product with their 3960 & 3970x though for those who really need that many cores (3990x on the other hand is once again just a gimmick, just like the 2990x))


That 3990X Threadripper 64-core/128-thread monster is a choice for those who require that kind of horsepower from a processor. 

It has nothing to do with AMD's mainstream/high-end lineups. Threadripper is for enthusiasts and heavy power users. And it's a great lineup, as many people are buying it for massively multithreaded programs. You don't buy Threadripper for PC gaming, though it can still do that too; it's just not worth the price tag as a gaming-only PC.



HenrySomeone said:


> Agreed, that's why I said Intel is still the undisputed desktop king; AMD has some useful options in the HEDT, but they still lack the single core punch and low latency to rival team blue where it matters most. And 4000 series will only narrow the gap, it's not gonna fully close it.


Based on reliable sources, for PC gaming, Zen 3 is said to bring "on average" a 17% IPC increase over Zen 2, clock for clock. That's an average; some games will see up to 30-50% and others won't see more than 1%. It's game-specific, according to what I've read. And its floating-point throughput is said to be 50%+ over Zen 2. 

We need fair competition from both companies. Nobody wants to see either go bust, or we would have industry-wide technology stagnation. Just look at what happened when AMD's Bulldozer did not live up to its hype: we had at least 5 full years of CPU stagnation. Now look at the industry with Zen, Zen+ (Zen+ was supposed to have been the original Zen) and Zen 2: it's very competitive and wins hands down in multithreading. This Zen push woke up the beast inside Intel, so bring on the competition so consumers can benefit from fair pricing and solid performance gains.


----------



## Nkd (Mar 4, 2020)

ToxicTaZ said:


> You can bluster all you want, but if your 105 W CPU can't outperform 125 W CPUs at everyday tasks and, of course, PC gaming... the joke's on AMD
> 
> The stock 9900KS at 127 W is already faster than the stock 3800X at 105 W
> 
> ...



You missed my point. The 9900KS is not a 127 W CPU when running at its all-core boost, and it's never running at base clock in gaming. I can promise you, I have had a 9900K, and when it's running at its all-core boost, which is almost always the case in gaming, it's burning close to 200 W. So that's my point: no, the 9900K is not beating the 3800X at 105 W. Intel CPUs use a lot more power at their all-core boost speeds, which they basically run all the time under load.
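The TDP-vs-boost point can be sanity-checked with the usual dynamic-power scaling rule, P ≈ C·V²·f. The clock and voltage figures below are hypothetical, purely to show how a rating near the base clock can become roughly 200 W at an all-core boost:

```python
def scaled_power(p_base: float, f_base: float, f_boost: float,
                 v_base: float, v_boost: float) -> float:
    """Estimate boost-state power from base-state power using the
    dynamic-power rule P ~ C * V^2 * f (same chip, so C cancels)."""
    return p_base * (f_boost / f_base) * (v_boost / v_base) ** 2

# Hypothetical example: a 127 W rating at 4.0 GHz / 1.15 V, scaled to an
# all-core boost of 5.0 GHz / 1.30 V (made-up values, not measured data).
print(round(scaled_power(127, 4.0, 5.0, 1.15, 1.30)))  # 203
```

This is only a first-order sketch (it ignores static leakage and temperature effects), but it shows why a TDP quoted at base clock says little about power draw under sustained boost.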


----------



## TranceHead (Mar 8, 2020)

Super XP said:


> I agree to a certain extent. It took AMD years to finally come out with something very competitive. But all I keep reading is massive negativity from those that refuse to accept ZEN as a viable CPU option. Technologically ZEN2 is well ahead of Intel in architecture, manufacturing process, price and performance. Anyhow I've debated this point to kingdom come. Lol just give AMD credit for coming out with ZEN and disrupting the entire CPU market, the same market that stagnated since 2011.
> 
> Here. Basing that on a major German retailer.
> 
> ...


Intel has 82% of the desktop market share currently, AMD has 18%

Intel 82% market share


----------



## Super XP (Mar 8, 2020)

TranceHead said:


> Intel has 82% of the desktop market share currently, AMD has 18%
> 
> Intel 82% market share


As of December 2019, AMD's CPU market share stood at 33%. By the end of 2020, with the introduction of Zen 3, I can see that share hitting 40% or more.

Going by Q3 2019 numbers, AMD's share stands at 32%.


> Intel controls most of the CPU market with its share growing from 77% in 2014 to *82% in 2016*, and back at *77% in 2018*.






> AMD’s share fell from *23% in 2014* to *18% in 2016*, but it has been on the rise since then, back to *23% in 2018*.
> In fact, if we look at *Q3 2019 numbers*, *AMD’s share stands at 32%*.
> This can be attributed to the success of its Ryzen processors, which offer comparable performance for a cheaper price when compared to Intel.


----------



## TranceHead (Mar 8, 2020)

Super XP said:


> As of December 2019 AMDs CPU market share stood at 33%. By the end of 2020 and the introduction of ZEN3 I can see that market share hit 40% or more.
> 
> Q3 2019 numbers, AMD’s share stands at 32%


"Desktop market share"
That was what we were talking about, wasn't it?


----------



## Super XP (Mar 8, 2020)

TranceHead said:


> "Desktop market share"
> That was what we were talking about, wasn't it?


I was talking about major retailers selling over 85% more Ryzen CPUs, in Germany and much of Asia.


----------



## TranceHead (Mar 8, 2020)

Super XP said:


> Was talking about major retailers selling over 85% more Ryzen CPUs. In Germany and most of Asian.


Then you need to choose your words more carefully.
Because worldwide, Intel holds 82% of the desktop market to AMD's 18%.


----------



## WeeRab (Mar 9, 2020)

ToxicTaZ said:


> What kind of drug fantasy are you on? Like it or not, the stock 9900KS still holds the record as the fastest gaming CPU! No competition from AMD's best 8-core CPU, the 3800X, or from any R9 series for that matter...
> 
> LOL, you talk about the 3950X as if it were extremely expensive, especially to AMD people who talk all day long about price, price, price. And quoting the 3990X is even more ridiculously expensive while going on about price all day!
> 
> ...


All that expense. All that heat. All that noise. Just to play games at 1080p. LOL.


----------



## Braggingrights (Apr 28, 2020)

WeeRab said:


> All that expense. All that heat. All that noise. Just to play games at 1080p.  LOL.



Expense? Worrying about cost is for office productivity builds; this is enthusiast territory, and damn the cost.
Heat? Are we talking about Ryzens now? Because in every test they burn like the sun compared to their Intel counterparts.
1080p? E-sports, dude. The whole world is chasing frames and latency, and your boys just don't deliver.


----------



## ToxicTaZ (Apr 28, 2020)

WeeRab said:


> All that expense. All that heat. All that noise. Just to play games at 1080p.  LOL.



Unfortunately, I have no idea what issues you're talking about. I'm the proud owner of a 9900KS @ 5.2 GHz, and it's an all-around fantastic overclocking CPU!

Heat? My rig is custom-cooled by EK! Its cooling system can handle 420 W (idle 25°C, load 55°C).

Noise? My rig runs at only 30 dB!

Playing games at 1080p? How do you know what resolution I play games at? I'm using "1600p," by the way: an LG UltraGear 38GL950G-B monitor (3840x1600) with an Nvidia RTX 2080 NVLink setup to drive it!

Why do you make stupid accusations about me when you know absolutely nothing about PC gaming setups?

The 9900KS is an amazing CPU and will keep me going until my next rig build (Intel 13th-generation i9 "Meteor Lake") on an Intel 700-series board with the LGA 1700 socket, PCIe 5.0, DDR5, USB4, and all the other goodies!

And yes, Intel Meteor Lake is 7nm+.

Looking forward to an ASUS ROG Maximus XV Formula motherboard in my future.

I'm skipping the 10nm and LGA 1200 / PCIe 4.0 board fiasco altogether.


----------



## Vayra86 (Apr 28, 2020)

Braggingrights said:


> Expense is for office productivity, this is enthusiast and damn the cost
> Heat? Are we talking about Ryzen's now, because in every test they burn like the sun compared to the Intel counterpart
> 1080p: E-Sports dude, the whole world is chasing frames and latency and your boys just don't deliver



8th-gen Coffee Lake wants its statements back. You're a little behind, it seems; right now Ryzen delivers perfectly fine FPS across the whole spectrum. Well, maybe except for that tiny and completely unimportant niche that chases 240fps/240Hz and will probably brag about 480fps CS:GO next year while they're sniped by cheaters. If Intel caters to that, they can have it. 

Ryzen now has lower temps, lower power, and higher IPC, and Intel actually needs its high turbo just to catch up and keep pace. Another big plus is that Ryzen has higher base clocks across the board, which makes it perform noticeably better in heat- and form-factor-restricted situations, i.e. laptops. Intel already lost that crown too.

Oh, did I mention it's cheaper and usually has better SMT as well?


----------



## Braggingrights (Apr 28, 2020)

Vayra86 said:


> 8th gen Coffee Lake wants its statements back. You're a little bit behind it seems, right now Ryzen delivers perfectly fine FPS across the whole spectrum. Well, maybe except for that tiny, and completely unimportant niche that chases 240fps/240hz and will probably brag about 480fps CS GO next year while they're sniped by cheaters. If Intel caters to that, they can have it
> 
> Ryzen now has lower temps, lower power, higher IPC, and Intel actually just needs its high turbo to catch up and keep pace. Another big plus is that Ryzen has higher base clocks across the board, which make it perform noticeably better in heat- and form factor restricted situations. ie Laptops. Intel already lost that crown too.
> 
> Oh, did I mention its cheaper and usually has better SMT as well?


Ahh yes, the unimportant niche that keeps you fanboyz up so late at night. Too bad, slowpoke... you'd better get back to that hot, unstable mess that hates Nvidia cards so you can lose COD again. 

Uh oh, you didn't even get on the podium, son. 

CPU UserBenchmarks - 1370 Processors Compared


----------



## Vayra86 (Apr 28, 2020)

Braggingrights said:


> Ahh yes the unimportant niche that keeps you fanboyz up so late at night, too bad slowpoke... you better get back to that hot unstable mess that hates nvidia cards so you can lose COD again
> 
> Uh oh, you didn't even get on the podium son
> 
> ...



Maybe you oughta click my system specs before you troll on.

With your attitude you are clearly on the wrong forum, WCCFTech & Reddit is _that way._


----------



## Braggingrights (Apr 28, 2020)

Vayra86 said:


> Maybe you oughta click my system specs before you troll on.
> 
> With your attitude you are clearly on the wrong forum, WCCFTech & Reddit is _that way._


Hit a nerve, did I, speccy?


----------



## Super XP (Apr 28, 2020)

ARF said:


> This chip won't be DOA and not needed on the market only if it costs around $250-$300.
> 
> That was because the *Athlons had tremendously higher IPC*, while the pentium was designed for *high clocks* with very long execution pipeline.


The Athlon 64's integrated memory controller helped with that much higher IPC, along with HyperTransport, a technology AMD helped co-design. 
Back in the day, Intel kept boosting clock speeds because it thought that was what made processors sell so well. From my memory, of course, lol.
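The clock-vs-IPC argument above reduces to throughput ≈ IPC × frequency. A toy sketch, with made-up IPC figures chosen only to illustrate how a chip can win with a 1000 MHz clock deficit:

```python
def relative_perf(ipc: float, ghz: float) -> float:
    """Rough single-thread throughput proxy: instructions/cycle * cycles/sec."""
    return ipc * ghz

# Hypothetical IPC values, purely illustrative -- not measured figures.
pentium4 = relative_perf(ipc=1.0, ghz=3.4)  # long pipeline, high clock
athlon64 = relative_perf(ipc=1.5, ghz=2.4)  # shorter pipeline, higher IPC

# Despite clocking 1000 MHz lower, the higher-IPC design wins on throughput:
print(athlon64 > pentium4)  # True
```

Real performance depends on memory latency, branch behavior, and workload mix, but this is the arithmetic behind "beating a P4 with 1000 MHz lower clocks."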


----------



## Braggingrights (Apr 28, 2020)

Cheap as chips, pardon the pun









Intel 10th Gen Comet Lake-S Desktop CPU Final Specifications And Prices Leak Out - Core i9-10900K 10 Core For $488 US, Core i7-10700K 8 Core For $374 US, Core i5 6 Core Starting at $150 US
The final specifications & prices of Intel's upcoming 10th Generation Comet Lake-S Desktop Core i9, i7, i5 & i3 CPUs have leaked out.
(wccftech.com)


----------



## EarthDog (Apr 28, 2020)

Braggingrights said:


> Cheap as chips, pardon the pun
> 
> 
> 
> ...


I wouldn't call that cheap unless you are in a vacuum. If we look around, your choices are a 3800X for $399 (on sale now for $350 - Intel should handily beat this in single and multi thread), and the 3900X at $489.99. Here, single thread will go to Intel (thanks, clock speed), but multi-threaded, AMD would get the nod.

It depends.


----------



## Braggingrights (Apr 28, 2020)

EarthDog said:


> I wouldn't call that cheap unless you are in a vacuum. If we look around, your choices are a 3800x for $399 (on sale now for $350 - Intel should should handily beat this in single and multi thread), and the 3900x at $489.99. Here, single thread will go to Intel (thanks clock speed), but multi-threaded AMD would get the nod.
> 
> It depends.


It's cheap in my situation; stability and silence are all I care about, so I wouldn't even look at red... and I don't do Windows or vacuuming, that's just me, man.


----------



## EarthDog (Apr 28, 2020)

Braggingrights said:


> It's cheap as in my situation, stability and silence is all I care about so I wouldn't even look at red... and I don't do windows or vacuuming, that's just me man


Ok...

Don't mind the fact that you could run the same cooler on a 3900X as on this CPU, and it would output less heat (and therefore run the fans at the same speed or slower = quieter)...

You're... uhh, pretty bawls-deep in Intel, aren't you, bud...


----------



## Braggingrights (Apr 28, 2020)

EarthDog said:


> Ok...
> 
> Don't mind the fact that you can run the same cooler on a 3900x that you would on this CPU and it would output less heat (and therefore be able to run the same speed or less = quiet)...
> 
> You're...uhh, pretty bawls deep in Intel aren't you, bud...


Nup, my favourite chip of all time was an AMD... but you're the one who responded to my post to push your agenda. 

Infantile fandom for either company really wouldn't make sense unless you were a very heavy investor in one of them... bud.


----------



## EarthDog (Apr 28, 2020)

Braggingrights said:


> Nup, fave chip all time was an AMD... but you're the one that responded to my post to push your agenda.
> 
> Infantile fandom for either company really wouldn't make sense unless you are a very heavy investor in one of them... bud


I don't have an agenda.... ease off the hackles.

You mentioned quiet and stability... both platforms are inherently stable, so that's a net wash. And since both the 3800X and 3900X use less power and output less heat, either could run as quiet or quieter. Depending on your (read: anyone's) uses, you could benefit from the additional cores/threads, or from the clock speeds. Both can be viable. 

I was simply trying to put 'cheap' into perspective (against its competition).


----------



## ARF (Apr 28, 2020)

Braggingrights said:


> Cheap as chips, pardon the pun
> 
> 
> 
> ...




Quite good. Suddenly AMD is no longer that competitive. Expect price cuts from AMD.

Core i9-10900K 10C/20T $488
Core i9-10900KF 10C/20T $472
Core i9-10900 10C/20T $439
Core i9-10900F 10C/20T $422
Core i9-10900T 10C/20T TBD
Core i7-10700K 8C/16T $374
Core i7-10700KF 8C/16T $349
Core i7-10700 8C/16T $323
Core i7-10700F 8C/16T $298
Core i7-10700T 8C/16T TBD
Core i5-10600K 6C/12T $262
Core i5-10600KF 6C/12T $237
Core i5-10600 6C/12T $213
Core i5-10600T 6C/12T TBD
Core i5-10500 6C/12T $192
Core i5-10500T 6C/12T TBD
Core i5-10400 6C/12T $182
Core i5-10400F 6C/12T $157
Core i3-10350K 4C/8T TBD
Core i3-10320 4C/8T $154
Core i3-10300 4C/8T $143
Core i3-10100 4C/8T $122
Core i3-10100T 4C/8T TBD
Pentium G6600 2C/4T $86
Pentium G6500 2C/4T $75
Pentium G6400 2C/4T $64
Pentium G6400T 2C/4T TBD
Celeron G5900 2C/2T $52
Celeron G5900T 2C/2T TBD


Ryzen 9 3950X 16C/32T $738
Ryzen 9 3900X 12C/24T $490
Ryzen 7 3800X 8C/16T $345 down from $400
Ryzen 7 3700X 8C/16T $299
Ryzen 5 3600X 6C/12T $205
Ryzen 5 3600 6C/12T $190
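One rough way to compare the two lists above is price per core, using only the prices quoted in this post (leaked/listed figures, not confirmed retail pricing):

```python
# Leaked/listed prices from the post above: (price in USD, core count).
prices = {
    "i7-10700K": (374, 8),
    "Ryzen 7 3800X": (345, 8),
    "Ryzen 9 3900X": (490, 12),
}

# Dollars per core -- a crude value metric that ignores clocks, IPC, and TDP.
for name, (usd, cores) in prices.items():
    print(f"{name}: ${usd / cores:.2f} per core")
# i7-10700K: $46.75 per core
# Ryzen 7 3800X: $43.12 per core
# Ryzen 9 3900X: $40.83 per core
```

As a later reply in the thread notes, a direct price comparison is incomplete without clock speeds and TDP, so this is only one axis of the comparison.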


----------



## Braggingrights (Apr 28, 2020)

EarthDog said:


> I don't have an agenda.... ease off the hackles.


Says the guy using the crazy face and having a melty... I'll buy red (as I have in the past) when they are compelling, and I don't find them that at the moment. Yes, you can make a value argument, but if it's based on that crappy Wraith cooler, then don't talk about heat and noise; and if you do, then stop talking about value. You can't have it both ways. And like it or not, there are more instability stories about red than blue, and for me that's critical... but hey, maybe next gen, if blue can't pull another rabbit out of the hat by backporting Willow Cove to 14nm.


----------



## EarthDog (Apr 28, 2020)

Braggingrights said:


> Says the guy using the crazy face and having a melty... I'll buy red (as I have in the past) when they are compelling, I don't find them that at the moment, yes you can make a value argument, but if it's based on that crappy wraith cooler then don't talk about heat and noise, and if you do then stop talking about value, can't have it both ways. And like it or not there are more instability stories about red than blue, for me that's critical... but hey maybe next gen if blue can't pull another rabbit out of the hat backporting Willow to 14nm


It's based on whatever cooler you put on either. The point you may have missed is they use less power which can yield the same level of quiet or better. My personal exposure to both platforms yielded the same levels of stability, stock or overclocked. 

Surely you have your reasons, but the ones listed don't hold much weight when held up to the light.

Happy trails!


----------



## ARF (Apr 28, 2020)

AMD's pricing is crazy. The difference between Ryzen 9 3900X and Ryzen 9 3950X is $248 for 4C/8T.
No one sells quad-core performance for that much money


----------



## Caring1 (Apr 28, 2020)

ARF said:


> Quite good  Suddenly AMD is no longer that competitive. Expect price cuts from AMD.
> 
> Core i9-10900K 10C/20T $488
> Core i9-10900KF 10C/20T $472
> ...


A direct price comparison is irrelevant without clockspeeds and TDP.


----------



## Braggingrights (Apr 28, 2020)

EarthDog said:


> It's based on whatever cooler you put on either. The point you may have missed is they use less power which can yield the same level of quiet or better. My personal exposure to both platforms yielded the same levels of stability, stock or overclocked.
> 
> Surely you have your reasons, but the ones listed dont have much when held up to the light.
> 
> Happy trails!


Oh the power, the power, I'll be able to run an extra lightbulb at night while I get thrashed in PUBG, if it doesn't randomly reboot of course. It's just not compelling, sorry if that isn't what you want to hear but it's not exactly like we are living in a golden age anyway... come back with me if you will to a time when giants roamed the earth, the mighty Penny 4 could only surprise if it didn't overclock 50% at stock voltage... or the bird, the majestic 1GHz T-Bird, the game changer, say what you will about numbers but those big round ones don't get any smaller... but here I am ranting on about Elvis and the Beatles and you live in the age of Justin Bieber, you have my condolences


----------



## EarthDog (Apr 28, 2020)

What's your pubg/steam handle... I'll thrash you. Lol

I really don't know how to respond to the rest of that drivel though, sorry.

Nobody gives a hoot about power use when it comes to cost...the point was cooling and quiet...


----------



## Braggingrights (Apr 28, 2020)

Caring1 said:


> A direct price comparison is irrelevant without clockspeeds and TDP.


The whole TDP thing is just perspective, you don't dial back the power on the Large Hadron Collider because you are more worried about the electricity bill than finding the Higgs, I get it's important for some people but it's not across the board



EarthDog said:


> What's your pubg/steam handle... I'll thrash you. Lol
> 
> I really dont know how to respond to the rest of that drivel though, sorry.


Yeah I was worried it might not clear that pointy bit at the top 

As for PUBG you can always dream, I'm sure we'll meet one dank and darky


----------



## ARF (Apr 28, 2020)

Caring1 said:


> A direct price comparison is irrelevant without clockspeeds and TDP.


Intel 10th Gen Core (Comet Lake-S) Final Specs and Pricing leaked - VideoCardz.com

Intel has just confirmed the final pricing and specifications of its upcoming 10th Gen Core-S series. Update: We have added a presentation leaked by HD-Tecnologia. Intel Core i9-10900K: 10 cores, 488 USD and a new box Intel will formally announce the Comet Lake-S series on April 30. The new CPUs...

videocardz.com


----------



## EarthDog (Apr 28, 2020)

Braggingrights said:


> As for PUBG you can always dream, I'm sure we'll meet one dank and darky


I'm dank as F. 

Carry me!


----------



## Braggingrights (Apr 28, 2020)

EarthDog said:


> I'm dank as F.
> 
> Carry me!


I can see I'd have to carry you the whole game, Intel system will cure those random reboots


----------



## ARF (Apr 28, 2020)

It's a bad time to buy a PC. Zen 3 is coming in less than 6 months and will be the last CPU line compatible with the old AM4 platform.

Those Pentiums and Celerons with 2 threads and 58 W TDP are crazy. Pity the people who would end up with them; tell everyone NOT to buy those


----------



## Braggingrights (Apr 28, 2020)

ARF said:


> It's bad timing to buy a PC. Zen 3 is coming in less than 6 months


In 6 months it'll be a bad time to buy a PC with Zen 4 coming


----------



## ARF (Apr 28, 2020)

Braggingrights said:


> In 6 months it'll be a bad time to buy a PC with Zen 4 coming




Yes, I mean 2020 and 2021 until the Zen 4 launch are a bad time for a new PC.
Not Rocket Lake, but whatever comes after it should be more interesting as well.


----------



## Braggingrights (Apr 28, 2020)

ARF said:


> Yes, I mean 2020 and 2021 until the Zen 4 launch are a bad time for a new PC.
> Not Rocket Lake, but whatever comes after it should be more interesting as well.


The Willow backport will make it interesting, 5.5 GHz plus?


----------



## ToxicTaZ (Apr 28, 2020)

Braggingrights said:


> In 6 months it'll be a bad time to buy a PC with Zen 4 coming



Keep in mind Zen 3 is the end of the line for all AM4 socket boards! 

Zen 4 is on the AM5 socket, for new system builds only, and not backwards compatible with AM4 socket boards! 

At the same time, Intel also has another new socket too.... the Intel H6 LGA 1700 socket with a new architecture... Intel 12th gen 16 cores (Alder Lake) on 10nm++

I personally am waiting for Intel 13th gen (Meteor Lake) 7nm+ on second generation H6 socket. 

It's going to be battle of the Sockets very soon!


----------



## ARF (Apr 29, 2020)

Braggingrights said:


> The Willow backport will make it interesting, 5.5Ghz plus?




Maybe not. Normally, IPC increase of 20% or so means lower frequencies, not higher frequencies.



ToxicTaZ said:


> Keep in mind Zen 3 is the end of the line for all AM4 socket boards!
> 
> Zen 4 is on the AM5 socket, for new system builds only, and not backwards compatible with AM4 socket boards!
> 
> ...




13th generation? I think they should restart from the 1st generation once the big.LITTLE approach is implemented.


----------



## Vayra86 (Apr 29, 2020)

Braggingrights said:


> Hit a nerve did I speccy



Nah just trying to prevent you from looking silly  Failed miserably, too


----------



## Braggingrights (Apr 29, 2020)

Vayra86 said:


> Failed miserably, too


I'm sure you're used to it


----------



## Chrispy_ (Apr 29, 2020)

ARF said:


> AMD's pricing is crazy. The difference between Ryzen 9 3900X and Ryzen 9 3950X is $248 for 4C/8T.
> No one sells quad-core performance for that much money


Hilarious! You're mocking AMD for something you just showed Intel doing, at a core count Intel can't even offer.

You've just earned another fanboy certificate, well done!


----------



## ARF (Apr 29, 2020)

The difference between the 10-core Core i9-10900F and the 6-core Core i5-10600K is $160

AND

we don't know if they actually use the same die - I bet NO!

Also, the Ryzen 9 3900X is just a salvaged Ryzen 9 3950X. Everything is the same except the binning and maybe an artificially disabled CCX...


----------



## Vayra86 (Apr 29, 2020)

ARF said:


> Everything is the same, except the binning and maybe artificially disabled CCX...



Yes, this is how product stacks are formed out of a single chip, and it happens everywhere... It's the whole reason AMD can do what it does now.

Also... what sort of weird comparison is this? per core price? We don't do this when we count GPU shaders either, do we? We know the upper end has a markup bigger than its relative performance.

If you want to compare core counts by price gap, you also have to take into account the relative number of cores of the low end compared to top end. After all, if you have more total cores to spread your cost difference across, you'd expect a different pricing structure too.

So really, until Intel and AMD sell like-for-like cores AND core counts across the whole stack, you can't just put dollars side by side and say one is better or worse than the other.
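To put numbers on that, here's a quick sketch computing $/core from a handful of the leaked 1k-unit prices quoted earlier in the thread (retail prices will differ, so treat these as rough figures):

```python
# Rough $/core comparison using a few of the leaked 1k-unit prices
# quoted earlier in the thread (actual retail pricing will differ).
lineup = {
    "Core i9-10900K": (10, 488),
    "Core i7-10700": (8, 323),
    "Core i5-10600": (6, 213),
    "Ryzen 9 3950X": (16, 738),
    "Ryzen 9 3900X": (12, 490),
    "Ryzen 7 3700X": (8, 299),
    "Ryzen 5 3600": (6, 190),
}

for name, (cores, price) in lineup.items():
    print(f"{name:15s} {cores:2d}C  ${price:3d}  ${price / cores:5.2f}/core")
```

The top SKU on each side carries the biggest per-core markup, which is exactly why a raw dollar gap between two SKUs says little on its own.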


----------



## Chrispy_ (Apr 29, 2020)

ARF said:


> The difference between the 10-core Core i9-10900F and the 6-core Core i5-10600K is $160
> 
> AND
> 
> ...


There we go again, proving your Intel BIAS once more.

Why are you comparing an F to a K all of a sudden? Is it because in a fair comparison your argument falls apart? I think it is.
/sigh


----------



## EarthDog (Apr 29, 2020)

Vayra86 said:


> Yes, this is how product stacks are formed out of a single chip, and it happens everywhere... Its the whole reason AMD can do what it does now.
> 
> Also... what sort of weird comparison is this? per core price? We don't do this when we count GPU shaders either, do we? We know the upper end has a markup bigger than its relative performance.
> 
> ...


Thanks isn't enough... QFT!!

....who cares about gaps between core/thread count? Lol


This may be the first time ARF has been called an Intel fanboy...lol


----------



## ARF (Apr 29, 2020)

Chrispy_ said:


> Why are you comparing an F to a K all of a sudden?



AMD has only a single 12-core and a single 16-core SKU. This is a gigantic problem which they must address.
Intel has a much more diverse lineup.
That's a fact, and I'm not an Intel fanboy - I just agree that their product lineup construction is better.


----------



## Vayra86 (Apr 29, 2020)

EarthDog said:


> Thanks isnt enough... QFT!!
> 
> ....who cares about gaps between core/thread count? Lol
> 
> ...



I guess we should chalk that up to these lockdown measures. No. I _hope_ so.


----------



## londiste (Apr 29, 2020)

The lower 6-core models are the cores/$£€ sweet spot. Anything above that will be more expensive per core.
Ryzen 5 3600 and 10400F.


----------



## ToxicTaZ (Apr 29, 2020)

ARF said:


> Maybe not. Normally, IPC increase of 20% or so means lower frequencies, not higher frequencies.
> 
> 
> 
> ...



You're right!! 

As far as I know, Intel Alder Lake and Intel Meteor Lake are listed as 12th & 13th generation on the new H6 LGA 1700 socket. All the leaks on the net say that 16-core Intel Alder Lake (big.LITTLE) is a brand new architecture on 10nm++, and Intel Meteor Lake is 7nm+ (an Alder Lake refresh), both on the H6 LGA 1700 socket with PCIe 5.0, DDR5 and USB4 

To my knowledge Intel Rocket Lake memory controller is PCIe 4.0 for H5 LGA 1200 socket PCIe 4.0 boards 400/500 series. 

To my knowledge Intel Meteor Lake memory controller is PCIe 5.0 for H6 LGA 1700 socket PCIe 5.0 boards 600/700 series. 

To my knowledge AMD AM5 is PCIe 5.0 with DDR5 and USB-4 from the start! (5nm) 

You can do your own research....but it seems clear what's coming. 

Yeah, Intel should call Intel Alder Lake & Intel Meteor Lake #1 & #2 of the new architecture....


----------



## WeeRab (Apr 29, 2020)

Is there an actual cooler out there that can dissipate 300+w in a normal case? Air OR Water?


----------



## londiste (Apr 29, 2020)

WeeRab said:


> Is there an actual cooler out there that can dissipate 300+w in a normal case? Air OR Water?


A 240 mm radiator should be reasonably OK for doing that. Bigger air coolers are capable as well, although noise might be a problem. Look at Intel's HEDT or Threadripper platforms for some ideas of what working with that kind of heat is like.


----------



## ARF (Apr 29, 2020)

WeeRab said:


> Is there an actual cooler out there that can dissipate 300+w in a normal case? Air OR Water?




Noctua NH-D15 ?

My Arctic Freezer 13 is rated up to 200W.


----------



## ToxicTaZ (Apr 30, 2020)

WeeRab said:


> Is there an actual cooler out there that can dissipate 300+w in a normal case? Air OR Water?


PC water cooling solutions and systems by world leader EKWB

Premium PC water cooling systems – best heat-removal solutions for your computer. Everything that you need, from custom loops to AIOs for beginners.

www.ekwb.com


----------



## Braggingrights (Apr 30, 2020)

So they've eked out roughly another 1 GHz from the original 14nm of 3 years ago.

They've managed an extra 1 GHz when it took about 15 years to go from 1 GHz to 4 GHz.

And they've managed to do it at the wrong end of that GHz race and on an aging fab process... it's not a bad effort really.


----------



## Caring1 (Apr 30, 2020)

The title's wrong going by this chart.


			https://www.techpowerup.com/img/22p4myCRlQtZJtNq.jpg


----------



## Earthplayer (May 5, 2020)

Braggingrights said:


> Expense is for office productivity, this is enthusiast and damn the cost
> Heat? Are we talking about Ryzen's now, because in every test they burn like the sun compared to the Intel counterpart
> 1080p: E-Sports dude, the whole world is chasing frames and latency and your boys just don't deliver


Which tests are you talking about? All those lies and argumentum ad hominem in almost every post from you are very sad.

Chips like the 3600/3600X/3700X consume a lot less power than their Intel counterparts. The exception is idle consumption, where Intel pulls slightly ahead (which doesn't matter for us, as we don't use our PCs just to browse the web - it's always gaming or working, otherwise they're off). And the new 10th gen chips from Intel actually state their TDP at base clock, not boost (this is why the "forced 95 W TDP mode" shows a 3.5 GHz clock compared to 3.8 GHz at the 125 W TDP). The turbo-clock power consumption (power consumption = heat) goes above 200 W on the 10700K, and that's just the chip, without anything else. You say AMD chips would be hotter than Intel chips, which is simply not true.

For some power consumption (and hence heat production) tables, see the link below. At the same performance level, Intel chips tend to run hotter/use more power than AMD (Zen 2) chips.



			https://images.anandtech.com/graphs/graph14605/111362.png
		


The older Zen 1/Zen 1+ chips produced a lot more heat than the Zen 2 chips, though, which is to be expected when going down to 7 nm with Zen 2. Zen 3 is going to be rather interesting. If you want the highest FPS numbers in games possible you could still go for an Intel processor. But you will have to go with AMD if you don't want your system to basically be a heater, want it to be a lot quieter, and, depending on where you live, want to save a decent amount of money on your electricity bill (sure, the USA has cheap electricity, but most countries like mine have double to triple the cost per kWh - a 50 W difference with 4 hours of high workload or gaming per day can easily eat 30-40€ per year where I live). A system which runs cooler, runs quiet, is cheaper and consumes less electricity is well worth a ~5% difference in framerate in most games.

And games actually using 8 cores / 16 threads will become a lot more common with the new console generation releasing soon, which might mitigate that difference in the future or even turn the numbers around. Most CPU-bound games currently have a larger issue with draw calls more than anything, though, so Vulkan/DX12 should fix the multithreading bottlenecks eventually. (It doesn't matter if a game like Planet Coaster can use 16 threads if the draw calls are bound to two threads, bottlenecking even the best processors to below 60 fps in lategame while two threads run at 100% and all others run at 30%.)
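As a side note, the electricity arithmetic in that parenthetical is easy to check; a minimal sketch (the €/kWh rate is an assumed example, not a figure from the post):

```python
# Yearly cost of a sustained power-draw difference between two systems.
# The 50 W delta and 4 h/day figures come from the post above; the
# electricity rate is an assumed example (~0.40 EUR/kWh).
def annual_cost_eur(delta_watts, hours_per_day, eur_per_kwh=0.40):
    kwh_per_year = delta_watts / 1000 * hours_per_day * 365
    return kwh_per_year * eur_per_kwh

# 50 W difference at 4 hours of gaming per day:
print(f"{annual_cost_eur(50, 4):.2f} EUR/year")  # 73 kWh/year at 0.40 EUR/kWh
```

A 50 W delta over 4 h/day is 73 kWh/year, so a €30-40 annual figure corresponds to rates around €0.41-0.55/kWh.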

And on the stability side of things: early Ryzen chips definitely had issues. But BIOS updates and chipset driver updates (download those directly from AMD - they almost always have a newer version than the motherboard manufacturer) fixed all common instabilities. Some people still have issues due to some very specific combination of hardware and software installed, or some bad BIOS or Windows settings, but personally neither my wife (AMD 3700X) nor me (Intel 7700K) have had any issues. Neither of those systems had a BSOD in the past six months (got the 3700X back then, she was still on a 4770K before that) and both are used for gaming and heavy workloads (the PCs at work are bad and we work at a radiology department, hence we like to run the 3D image calculation from the 256-slice low-dose CT on our own PCs when doing home office - the PCs at work are old enough to still run Windows 98...).

Can you have issues with a chip? Yes, my 2500K back in the day had BSOD issues even though it is supposedly one of the most stable processors from that time; I simply lost the chip lottery back then. But I don't really see any issues with current Ryzen 3000 processors. I don't know anyone who has had issues with them either, and we have a lot of "gamer friends" (kinda comes with working at a radiology department, basically the only geeky part of hospitals where we live - makes it a lot of fun to work there, as no one is opposed to having fun with an after-work karaoke party or similar things - internists and other doctors we get to see sometimes are rather boring in comparison).

There are still reasons to go for Intel, but it's definitely not lower heat or stability anymore. Stability is the same, and heat is far worse with 10th gen now, and slightly worse with 9th gen, compared to Zen 2. I still hope Zen 3 will be as much of a boost as promised in the roadmaps (Zen 2 delivered on its promises, but you can never know). At that point the pressure from both sides will be large enough to see massive price drops on both sides (like we saw with the cut-in-half prices from Intel a short while ago).

Another thing: the 10th gen prices in the current presentation are "cost per unit when buying 1000 units". The real market price is normally 15-20% higher at first. Intel is using the same trick they use for the TDP values: they go for the base instead of boost clock TDP, and they go for "if you buy 1000 units" instead of the recommended retail price. On top of that they like to present their single-core boost clock as "the boost clock" instead of the all-core boost. This time they at least showed both in their presentation. AMD uses similar tricks, but not to such an extent. Hence looking just at the numbers on paper instead of actual benchmarks (power consumption etc.) might make you believe that Intel would be better in that regard while it actually isn't (as shown in the graph above - you can find many more tests and benchmarks rather easily, though).


----------



## londiste (May 5, 2020)

Earthplayer said:


> They go for the base instead of boost clock TDP and they go for the "if you buy 1000 units" instead of "recommended retail price". Ontop of that they like to present their single core boost clock as the "boost clock" instead of all core boost. This time they at least showed both in their presentation. AMD actually gives the real values for TDP at sustained boost, all core boost clock and rrp (although they use their own tricks for other values).


Are you saying both AMD and Intel are evil or what?
AMD similarly uses base clock for TDP, does not give real value as TDP for sustained boost and does not use all core boost clock for marketing but uses one core boost clock.

As far as prices go, Intel's retail prices have pretty much always been at a same or similar level as their RCP. 9000-series is kind of a fluke with excessive prices across the board for a while.


----------



## Earthplayer (May 5, 2020)

londiste said:


> Are you saying both AMD and Intel are evil or what?
> AMD similarly uses base clock for TDP, does not give real value as TDP for sustained boost and does not use all core boost clock for marketing but uses one core boost clock.


Yes, I do say that both use misleading tactics (or as you put it, "evil"). AMD uses minimal sustained boost clocks for TDP though (below max boost clocks but higher than base clock). This also shows in the full-load power consumption in the table I posted. Both Intel and AMD eat more than their TDP suggests - but AMD is closer to the real-life power consumption / heat production because it at least uses minimal sustained boost to measure it rather than base clock.

This is why, looking only at the TDP, you
1) Can't say that Intel would be "hotter" than AMD as "braggingrights" suggests (it simply is not true anymore - it used to be true, but that time is in the past)
2) Can't decide on a cooling solution based on TDP if you actually want sustained max boost clocks, no matter if you go AMD or Intel. Most consumers without any knowledge buying a PC simply see "95 W TDP" and go for a 95 W TDP-rated cooling solution. It works for base clocks, but sustained boost clocks will be limited with it, which they can only know if they have decent knowledge about it, which most consumers won't.

I really hate such shady tactics. Both Intel and AMD have been found guilty of and paid fines for bad practices in other areas, like Intel paying off PC and laptop manufacturers a few years ago to not include AMD in their standard lineups. Intel is worse when it comes to overstating what their chips have to offer at the current time, though. Some things were just opportunistic (like the 9th gen pricing, cut in half shortly after the Ryzen release) while others are simply distasteful, like the examples I mentioned in my other post.

And on the RCP thing: it would be great if 9th gen was just a fluke and 10th gen will be cheap. Actually competitive pricing would mean even cheaper Zen 3 chips, as AMD wants to keep the edge in price/performance. There are a lot more Intel users due to the lack of decent CPUs from AMD in recent years (before Ryzen). To make people who tend to stick with whatever brand they recently owned actually consider switching to AMD, they want to stay ahead when it comes to price/performance (it's sad how many people simply buy the next product from the same company instead of doing a comparison before spending their money). That competition would be great!


----------



## Braggingrights (May 5, 2020)

Earthplayer said:


> Which tests are you talking about? You sound like the biggest intel fanboy with all your lies and argumentum ad hominem in almost every post. You are either uninformed or try to troll that guy that is almost as bad as you when it comes to argumentum ad hominem. It is very sad to see you attack each other like that even though it has nothing to do with hardware or the processors at that point.
> 
> Chips like the 3600/3600x/3700x consume a lot less power compared to their intel counterparts. Except for idle consumption, intel pulls slightly ahead there. (doesn't matter for us as we don't use our PCs just to browse the web - always gaming or working else turning it off) And the new 10th gen chips from Intel actually have a TDP for their baseclock and not boost. (this is why the "forced 95w TDP mode" shows a 3.5ghz clock compared to the 125w TDP at 3.8ghz) The turbo clock power consumption (power consumption = heat) goes above 200w on the 10700k. And that's just for the chip without anything else. You say AMD chips would be hotter than Intel chips which is simply not true.
> 
> ...


Low quality post by braggingrights, avert your eyes kids


----------



## Earthplayer (May 5, 2020)

Braggingrights said:


> So many words for: AMD loses gaming again
> 
> I stopped reading after 'fanboy', it betrays a certain intent, so don't worry, I can guess most of it


You just pretty much proved to everyone that you are an uninformed fanboy with that one comment; thanks for making it easy to dismiss anything you say as blatant lies without any data to back it up. You don't even know the facts, dismiss any data proving you wrong, insult people constantly and spread lies (like Zen 2 producing more heat than 9th gen Intel). You should be ashamed of yourself.


----------



## Kursah (May 5, 2020)

Let's stop with the drama and name-calling please. It degrades the topics and devalues anything anyone has to say. Feel free to review our forum guidelines before posting further, thanks!


----------



## Earthplayer (May 7, 2020)

To put the whole heat argument to rest:

When it comes to lower power consumption, lower heat and hence lower noise levels, AMD wins. Don't trust the TDP values on the box from either company. Intel and AMD both don't use the real power consumption values for those, but Intel is a lot further from the real wattage than AMD, hence on paper it looks like AMD would run hotter while in reality - in games, in benchmarks, in software - AMD runs cooler than Intel.


----------



## Braggingrights (May 7, 2020)

Apollo 11 was pretty hot too, LHC output is insane, Ferrari V12 phew

If only their thermals were better they'd be pretty impressive products


----------



## ToxicTaZ (May 7, 2020)

Earthplayer said:


> To put the whole heat argument to rest:
> 
> 
> 
> ...



Just try running any Ryzen 3000 CPU at Intel 5 GHz+ club speeds and see what happens. 

Yes, Intel H5 LGA 1200 socket platforms with broken PCIe 4.0 have finally given the AMD 3000/4000 series a clear win! 

All this changes with Intel H6 LGA 1700 socket platforms (12th/13th generations)... with 16 cores big.LITTLE and 10nm++ & 7nm+ it's an automatic win in power development once again! Even AMD's upcoming 5nm AM5 platforms (5000/6000 series) won't be able to compete against the upcoming Intel H6 LGA 1700 socket..... 

It's going to be a clear win for Intel's new architecture in the upcoming Intel Alder Lake and Intel Meteor Lake (12th & 13th generations)


----------



## Earthplayer (May 7, 2020)

Braggingrights said:


> Apollo 11 was pretty hot too, LHC output is insane, Ferrari V12 phew
> 
> If only their thermals were better they'd be pretty impressive products


You started the heat argument to begin with and said AMD was running hotter. Glad to see you accepted that you were wrong though.  Your comparisons are honestly rather insane though.

Anyways, I really hope Intel puts out some interesting processors with their 11th gen. Would be great to see both sides compete strong enough to see the prices drop further.


----------



## Braggingrights (May 7, 2020)

Earthplayer said:


> You started the heat argument to begin with and said AMD was running hotter. Glad to see you accepted that you were wrong though.  Your comparisons are honestly rather insane though.
> 
> Anyways, I really hope Intel puts out some interesting processors with their 11th gen. Would be great to see both sides compete strong enough to see the prices drop further.


What? Based on that? Even they said their result was meaningless 

AMD hot, unstable and finicky in my experience, but some people are more forgiving of those things to save a few bucks, that's cool too


----------



## ToxicTaZ (May 7, 2020)

Earthplayer said:


> You started the heat argument to begin with and said AMD was running hotter. Glad to see you accepted that you were wrong though.  Your comparisons are honestly rather insane though.
> 
> Anyways, I really hope Intel puts out some interesting processors with their 11th gen. Would be great to see both sides compete strong enough to see the prices drop further.



Intel 11th gen "Rocket Lake" is still 14nm++ made by Samsung, said to be 12 cores with a working PCIe 4.0 memory controller for the H5 LGA 1200 socket. 

10nm++ Intel 12th gen "Alder Lake" is Intel next generation architecture! (Golden Cove cores) then 7nm+ Intel 13th generation "Meteor Lake" is basically Alder Lake Refresh to my knowledge. 

Intel H6 LGA 1700 socket PCIe 5.0 with DDR5 and USB-4 and all the other goodies... VS AMD AM5 socket PCIe 5.0 with DDR5 and USB-4


----------



## Earthplayer (May 8, 2020)

ToxicTaZ said:


> Intel 11th gen "Rocket Lake" is still 14nm++ made by Samsung said to be 12 cores and a working PCIe 4.0 memory controller for the H5 LGA 1200 socket.
> 
> 10nm++ Intel 12th gen "Alder Lake" is Intel next generation architecture! (Golden Cove cores) then 7nm+ Intel 13th generation "Meteor Lake" is basically Alder Lake Refresh to my knowledge.
> 
> Intel H6 LGA 1700 socket PCIe 5.0 with DDR5 and USB-4 and all the other goodies... VS AMD AM5 socket PCIe 5.0 with DDR5 and USB-4


I thought they finally figured out their issues with 10nm? Those roadmaps seem to change every few months now instead of every year... Zen 3 should be cheap as it's still on AM4, but this scares me for Zen 4. I hope AMD will still stay true to their cheap pricing and make Zen 4 equally cheap even if Intel has nothing to offer to compete at that point. But maybe Intel has a trick up their sleeve, who knows.

It will be interesting either way with DDR5, USB4 and PCIe 5.0 coming up for consumer products. I hope they improve the lithography for the motherboard chipsets though, else passively cooling PCIe 5.0 will be impossible. Not that general consumers will need PCIe 5.0 any time soon, considering GPUs don't even need the full 3.0 x16 right now. But when the first consumer motherboards with 3.0 arrived back in the day, we didn't see any new 2.0 boards anymore after a short while, even though 2.0 was plenty back then. But who knows. Maybe we see 4.0 alongside 5.0 for many years - cheap and medium-tier boards with 4.0 and high-tier with 5.0 only.

I still want those lithography improvements. 12nm for the newer chipsets right now is a nice improvement over the old 28nm, but there is still more than enough headroom for improvement even with current tech. After the 5 years of basically no improvements (compared to the jumps we saw before), it's great to see huge steps being made in the processor and motherboard market though.


----------



## ToxicTaZ (May 8, 2020)

Earthplayer said:


> I thought they finally figured out their issues with 10nm? Those roadmaps seem to change every few months now instead of every year... Zen 3 should be cheap as it's still on AM4 but this scares me for Zen 4. I hope AMD will still stay true to their cheap pricing and make Zen 4 equally cheap even if Intel has nothing to offer to compete at that point. But maybe Intel has a trick up their sleeves, who knows. Will be interesting either way with DDR5, USB-4 and PCIe 5.0 coming up for consumer products. I hope they improve the lithography for the motherboards though. Else passive cooling PCIe 5.0 will be impossible. Not that we will need PCIe 5.0 any time soon for general consumers considering GPUs don't even need the full 3.0 x16 right now. But with the first consumer motherboards with 3.0 back in the day we didn't see any new 2.0 boards anymore after a short while even though it was plenty enough back then. But who knows. Maybe we see 4.0 alongside 5.0 for many years - cheap and medium tier boards with 4.0 and high tier with 5.0 only. Still want those lithography improvements. 12nm for the newer boards right now is a nice improvement over the old 28nm but there is still more than enough headroom for improvement even with current tech. After the 5 years of basically no improvements (compared to the jumps we saw before ) it's great to see huge steps being made in the processor and motherboard market though.



AMD AM5 socket with Zen 4 & Zen 5 on first generation 5nm. Intel has a brand new architecture against it!! Intel's brand new double-the-IPC "Golden Cove" cores in the 3rd generation 10nm++ (Alder Lake) and second generation 7nm+ (Meteor Lake) on the new H6 LGA 1700 socket.

Intel pretty much is going to wipe AMD AM5 5nm with Alder Lake and Meteor Lake; with both based upon 16 cores big.LITTLE and on high-yielding 10nm++ & 7nm+, they will automatically win the power department!

Intel 12th and 13th generation will be similar to what the 4-core 2700K & 3770K were, but with 16 cores; big.LITTLE architecture is what's coming.

AMD AM5 will also lose backwards compatibility with AM4.... In fact, even Zen 3 on AM4 has been spotted having backwards compatibility issues too!

AMD B550 Chipset Detailed, It's Ready for Zen 3, Older AM4 Motherboards not Compatible

In their briefing leading up to today's Ryzen 3 3100 and 3300X review embargo, AMD disclosed that its upcoming "Zen 3" 4th generation Ryzen desktop processors will only support AMD 500-series (or later) chipsets. The next-gen processors will not work with older 400-series or 300-series chipsets...

www.techpowerup.com


----------

