
Intel 10th Generation Comet Lake Desktop Processors and 400-Series Chipsets Announced, Here's what's New

Methinks @birdie is an Intel shill.

I wonder how stating facts and nothing but facts can be considered "shilling" when I'm rocking a system based on the Ryzen 7 3700X right freaking now.

I know AMD fans hate facts and love illusions or "potential", but I'm sorry, I don't buy any of it.
 
This is an enthusiast site, and those who want to game with modern titles will find 4c/8t holds back some games at 1080p and 1440p. But we aren't talking about people's salaries and 1st/3rd-world bullshit... that's just ranting.

That said, I don't like the core/thread war that AMD started either. 16c/32t in the mainstream is a joke. In fact, I didn't like the 12c/24t part either. 8c/16t is plenty for the next few years.
 
I wonder how stating facts and nothing but facts
Because you sound like Intel's marketing department from a few years ago.

That said, I don't like the core/thread war that AMD started either. 16c/32t in the mainstream is a joke. In fact, I didn't like the 12c/24t part either. 8c/16t is plenty for the next few years.
I'd have to agree with you on that. The only thing this so-called "core/thread war" has done is make lower-end products not only cheaper but also a better buy. And in the end, that's all that really matters. People are now getting more for their money.
 
Because you sound like Intel's marketing department from a few years ago.


I'd have to agree with you on that. The only thing this so-called "core/thread war" has done is make lower-end products not only cheaper but also a better buy. And in the end, that's all that really matters. People are now getting more for their money.
Absolutely. My only concern is people generically thinking more c/t is better. They are... but only if you can use them. For gamers, 4c/8t in 2020 is already long in the tooth... I wouldn't go less than 8/8... 6/6 if you didn't have a choice.
 
Yeah. Before AMD came back and kicked Intel's ass, Intel's lower-end Core i3 chips were all nothing more than 2C4T parts; now they're 4C8T at nearly the same price. That's double the performance at nearly the same cost; the consumer wins.
 
Here's me, theoretically a newbie. I want to spend $500 and build a PC for gaming, and I know I'll be upgrading it as time goes on. Do I go with the 10th Gen i3 and get a graphics card (RX 570 or GTX 1650), or do I go for an AM4 build using a 3400G? Arguments can be had for both sides; neither is a bad choice. However, if you look at it objectively, AM4 offers way more flexibility to the end user than the current Intel lineup. We don't even have B550 yet, and there will be another launch in October??? Even the boldest Intel fanboy must be shaking in their boots at the proposition of Zen 4 and whether it is compatible with X370 or X470 (which would in turn mean B450) and potentially B350...
 
If I'm gaming, clearly Intel as you framed it... both are 4c/8t CPUs, right? The GTX 1650 and RX 570 are faster than that iGPU.

Zen 4 won't be compatible with anything lower than X470, I'd imagine. AMD promised support through 2020, and Zen 4 is years away.
 
Intel Press Conference

"Here is what's new, everything is Up To just a hair better than what we did last time. Maybe. If its not too hot"

Seriously, I'm gonna call this company Up To going forward.
 
Seriously, I'm gonna call this company Up To going forward.
Well, another user on my ignore list for discrimination... :p

I kid...

But seriously... all companies are 'up to' marketers! Think of it like fractions and common denominators... do to one side what you do to the other! Lol... don't hate... don't discriminate... they all do it. :)
 
Well, another user on my ignore list for discrimination... :p

I kid...

But seriously... all companies are 'up to' marketers! Think of it like fractions and common denominators... do to one side what you do to the other! Lol... don't hate... don't discriminate... they all do it. :)

There is no hate, it's just sad and funny all in one go :p And Intel is doing a fine job of making itself look silly lately, so yeah, they're in the crosshairs now.

Btw, if you go by those criteria, I'm sure this is already the quietest forum on earth for you, haha.
 
Absolutely. My only concern is people generically thinking more c/t is better. They are... but only if you can use them. For gamers, 4c/8t in 2020 is already long in the tooth... I wouldn't go less than 8/8... 6/6 if you didn't have a choice.

I totally agree with you; the 3300 could be a surprising chip if it lets all cores clock as high as touted on 7nm.
If I'm gaming, clearly Intel as you framed it... both are 4c/8t CPUs, right? The GTX 1650 and RX 570 are faster than that iGPU.

Zen 4 won't be compatible with anything lower than X470, I'd imagine. AMD promised support through 2020, and Zen 4 is years away.


I am so sorry, I've had a couple of Hollandias and enjoyed the offerings of the Government of Canada. I meant Zen 3 (to be honest, it's all a little confusing).
 
There is no hate, its just sad and funny all in one go :p And Intel is doing a fine job making itself look silly lately, so yeah, they're in the crosshairs now.

Btw if you go by those criteria I'm sure this is already the quietest forum on earth for you haha

The polarizing nature of so many users on this forum is incredibly off-putting (surely many feel the same about me getting information out, lol)... no doubt. My threads look like Swiss cheese sometimes, lol. But hey, if it isn't mitigated... we've got to do it. ;)

Yeah, just having a laugh over the 'hate', thing. :p

Seriously though, everyone markets in an 'up to' manner. This has nothing to do with Intel or whoever... THAT is marketing in the 21st century, sadly.
 
No, users should not buy PCs with as many cores as humanly possible, because unused cores are nothing but wasted money. People should always buy what's best for them (in terms of bang for the buck) within their budget. I do understand that most TPU users are tech enthusiasts who love to have overpowered PCs for bragging rights, but that's not how the world works! Many people save on food and clothes to be able to buy a PC, and you're insisting they should go and buy something like a Ryzen 7 3700X? Or companies which buy thousands of PCs for their workers? Why?? All these people will be just fine with a Core i3-10300 for the next 15 years. Yes, 15, because I had an Intel Core i5-2500-based PC until August 2019 and it still works perfectly. I replaced it not because I needed MOAR cores or speed but because I wanted a new PC for a change.

Also, please let me remind you of the AMD FX-8000/9000 CPUs, which had MORE cores but ran slower in the vast majority of tasks than Intel CPUs with half as many cores. So your argument about having MOAR cores goes out the window.

And since we've just established that MOAR cores are not that essential, we come back to square one.
  • Old bad Skylake at 5.3 GHz performs faster than any non-OC'd AMD CPU in existence in the vast majority of tasks.
  • AMD does win when MOAR cores actually get used, thanks to power throttling on the Intel side, because you can only go so far with power-hungry 14nm cores.
  • Intel does have CPUs with much better IPC than Skylake: Ice Lake (~18% IPC uplift) and Tiger Lake (~15% IPC uplift vs. Ice Lake).
Lastly, many people say new games will utilize MOAR cores, meaning slower but MOAR cores beat faster but fewer cores. This is very often not true. Let me explain why:
  • Most game engines have a master thread which synchronizes the load of all the other threads, and if this master thread becomes overutilized, your additional cores go to waste (see the toy sketch after this list).
  • The CCX complexes in AMD CPUs mean there's a certain amount of delay in communication between cores, which means games have to be specially coded for them; that adds complexity, and some game companies simply won't do the work, because there's this vendor, the constantly mocked Intel, which doesn't have inter-core communication issues. AMD has actually realized this as well, and Zen 3 is rumored to have 8-core CCX complexes, which solves the issue.
  • A lot of games don't actually need that many cores because they are not complicated enough, and programmers have no tasks to run on the additional cores. In fact, fewer than 5% of games in 2020 fully utilize more than 6 cores, which means the Intel Core i7-9700 is doing its job just fine, as are most four-core CPUs with HT.
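To make the master-thread point concrete, here's a toy C++ sketch (purely illustrative, not code from any real engine): a master thread fans per-frame jobs out to a worker pool, then does a fixed chunk of serial synchronization work itself. Past a certain worker count, the serial section dominates the frame time, so the extra cores buy you almost nothing; that's Amdahl's law in one loop.

// Toy illustration only (not from any real engine): a master thread
// fans per-frame jobs out to workers, then does a fixed chunk of
// serial "sync" work itself. Past a certain worker count, the serial
// part dominates the frame time, so the extra cores sit idle.
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Spin for roughly the given number of microseconds of "work".
static void busy(int us) {
    auto end = std::chrono::steady_clock::now() + std::chrono::microseconds(us);
    while (std::chrono::steady_clock::now() < end) {}
}

int main() {
    const int frames = 100;
    const int jobs_per_frame = 8;  // parallel work per frame: 8 x 0.5 ms
    const int job_us = 500;
    const int master_us = 2000;    // serial master-thread work: 2 ms/frame

    for (int workers : {2, 4, 8, 16}) {
        auto t0 = std::chrono::steady_clock::now();
        for (int f = 0; f < frames; ++f) {
            // Fan out: each worker takes an equal share of the jobs.
            std::vector<std::thread> pool;
            for (int w = 0; w < workers; ++w)
                pool.emplace_back([&] { busy(job_us * jobs_per_frame / workers); });
            for (auto& t : pool) t.join();
            // Serial section: the master thread "syncs" alone.
            busy(master_us);
        }
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                      std::chrono::steady_clock::now() - t0).count();
        std::printf("%2d workers: %lld ms for %d frames\n",
                    workers, (long long)ms, frames);
    }
}

With these made-up numbers, going from 2 to 16 workers only shrinks a frame from ~4 ms to ~2.25 ms, because the 2 ms serial section never gets any faster.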
Over and out.



Are you following me?

Let me quote myself again: "And the 3400G is quite a bit slower CPU-wise than the Core i3-10100 because it's Zen+, not Zen 2."

- Old bad Skylake at 5.3 GHz performs faster than any non-OC'd AMD CPU in existence in the vast majority of tasks.

Is this the slide we are looking for?

[image: Intel Computex kickoff slide, May 26 2019]


Do you also work for them?...

- Intel does have CPUs with much better IPC than Skylake: Ice Lake (~18% IPC uplift) and Tiger Lake (~15% IPC uplift vs. Ice Lake).
Where? Do they have a DESKTOP CPU with better IPC than AMD right now? Will they have one in the coming year, or the year after that?
If the answer is NO, then what's the point of even bringing it up? Zen 3 is around the corner, rumored to bring a big IPC lift once again after they refined the CCX/CCD layout, and with Zen 3 we are talking about a September-October time frame.
If Rocket Lake-S is lower core count and lower frequency but with higher IPC, what performance benefit will the "but my 5.3GHz CPU is best cuz marketing told me so" crowd see? They definitely don't care about PCIe 4.0 and a better iGPU. "Tiger Lake", when is that? Late 2020 on laptops and ~2021 on desktop? So it will compete with Zen 4...


- All these people will be just fine with a Core i3-10300 for the next 15 years. Yes, 15, because I had an Intel Core i5-2500-based PC until August 2019 and it still works perfectly. I replaced it not because I needed MOAR cores or speed but because I wanted a new PC for a change.

Sorry, but that's a funny comment. You claim a new CPU will be "good" for 15 years, and you base that on the fact that you changed your own CPU after 9 years, stating it "wasn't because of performance".
You know what CPUs were available 15 years ago? The Athlon 64 X2 and the Pentium D. Do you really want to use one of those with a modern operating system and a modern browser? Never mind gaming. The answer is you don't. In another 6 years, people will look at your beloved i5 2500 the same way people look at a C2D today: it's old, slow, and shouldn't be used.
You should also remember that, for the most part, software takes time to catch up to hardware. 4c/4t was the mainstream performance baseline for a long time; you wouldn't target software at people running 8c/8t CPUs when only 1% of people have those, so you target the lower end. A 2500 is slower than a modern i3; if you don't game, that's fine, but let's not pretend a 2500K is a "decent" gaming CPU in 2020. I had an i5 3470 from the moment it launched and kept it for several years, until I swapped it for an E3-1270 v2 (~i7 3770) I grabbed cheap on eBay; frame times in games were much more consistent after the swap (and that's with a mid-range GTX 970).
Every few years, software catches up with hardware improvements, and it happens sooner rather than later. In 2010 we said "games don't need more than 2 cores"; 3 years later, when I swapped my aging E8400 C2D for an i5 3470 (in late 2012, when it launched), the difference was night and day. Today people repeat the broken record that games don't use more than 4c/8t, so in a year it will change again to "but games don't use more than 8c/8t", and so on. Technology moves forward, and you can't expect any piece of hardware to stay relevant forever. Even for office PCs, an i5 2500 is starting to show its age; believe me, I know, having the "pleasure" of using one inside a generic HP SFF box as my work PC (thankfully with an SSD). And while its performance is decent for its age, it's nothing to write home about.
 
The CCX complexes in AMD CPUs mean there's a certain amount of delay in communication between cores, which means games have to be specially coded for them; that adds complexity, and some game companies simply won't do the work, because there's this vendor, the constantly mocked Intel, which doesn't have inter-core communication issues. AMD has actually realized this as well, and Zen 3 is rumored to have 8-core CCX complexes, which solves the issue.
I'm not sure if I read you correctly, but coding games to be "optimized" around low-level core-design "shortcomings" like this is approaching the impossible. And even if it were done, it would have to be managed by the OS kernel. Regardless, this would be working around poor design choices, and it would also require dozens of specially crafted schedulers in each OS. I believe this would be a very bad idea on principle, and would lead to loads of poorly maintained code. Writing good software is complex and messy enough as it is; the last thing we need is more piles of workarounds.

Regarding the Zen 2 design with 2 CCXs per die: this isn't a problem (except perhaps in a few edge cases), and especially not for gaming. Games do as little core-to-core synchronization as possible, because it's expensive regardless of CPU design, and there are also cascading problems due to OS scheduling overhead. I do expect Zen 3 to bring some improvements with its 8-core CCXs, but that will be down to more cores sharing L3 and other design improvements. Games are, on the other hand, very sensitive to memory latency, and within a single rendered frame a thread will do far more memory accesses than thread-to-thread communication. What Zen 3 brings in terms of memory-controller and core front-end improvements will be the deciding factors for gaming performance.

Zen (1) did, however, have issues with the larger Threadrippers, as we know, practically making them useless for gaming. That's real bottlenecking.
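If you want to put a number on how expensive that core-to-core synchronization actually is, here's a rough C++ sketch of the usual measurement approach: two threads bounce an atomic flag back and forth, and every round trip pays a cache-line transfer between their cores. Note that to deliberately compare same-CCX vs. cross-CCX pairs you'd pin the threads to specific cores, which is OS-specific (e.g. pthread_setaffinity_np on Linux, or running the binary under taskset); left unpinned as below, it measures whichever two cores the scheduler picks.

// Rough sketch of a core-to-core "ping-pong" latency test: two
// threads bounce an atomic flag, and every round trip pays the
// cache-line transfer between their cores. Pin the threads (not
// shown, OS-specific) to compare same-CCX vs. cross-CCX cores.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    constexpr int kRounds = 1000000;
    std::atomic<int> turn{0};  // 0 = main's turn to ping, 1 = pong's turn

    std::thread pong([&] {
        for (int i = 0; i < kRounds; ++i) {
            while (turn.load(std::memory_order_acquire) != 1) {}  // wait for ping
            turn.store(0, std::memory_order_release);             // reply
        }
    });

    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < kRounds; ++i) {
        turn.store(1, std::memory_order_release);             // ping
        while (turn.load(std::memory_order_acquire) != 0) {}  // wait for reply
    }
    auto ns = std::chrono::duration_cast<std::chrono::nanoseconds>(
                  std::chrono::steady_clock::now() - t0).count();
    pong.join();

    // One round trip = two handoffs, so halve for the one-way cost.
    std::printf("avg one-way core-to-core handoff: ~%lld ns\n",
                (long long)(ns / kRounds / 2));
}

A handoff like this costs tens of nanoseconds even in the best case, which is exactly why engines keep thread-to-thread communication to a minimum, regardless of vendor.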
 
$70 is what people in some African countries earn in a month. Sometimes it helps to leave the cozy vacuum of your rich American life and realize there's a world outside with actual people, and for many of them $70 is a ton of money.
This post makes me leave this thread.
Trolling is one thing.
Bizarre pseudo-arguments/comparisons from a mental asylum in the Star Trek universe are a whole different sport.
 
Check out.
4000 series...
Mainboards...

If you dare...

I hope they come with fire extinguishers... :D

You mean like how fast AMD 4XXX laptops were going to be?

Here are two charts for you: PCMark, overall system performance.
The top one is the top performers with the highly vaunted king of laptop chips according to the pundits, er, tech sites: the 4800HS.
The chart below it is the top performers with an i7-9750H.

And yes, I know it's not the CPU itself, but I don't make my own chipsets and drivers. This is what you can expect from an actual laptop right now.

AMD 4800HS:

[chart: PCMark scores of top AMD 4800HS laptops]



Last year's Intel 9750H-based laptops, scoring about 20%+ higher:

[chart: PCMark scores of top Intel i7-9750H laptops]
 

The top one is the top performers with the highly vaunted king of laptop chips according to the pundits, er, tech sites: the 4800HS.
Nope, the current AMD top chip is the 4900HS, and their best mobile chip is the 4900H, non-S. There's also the fact that Intel chips easily pull 90W in many high/top-tier notebooks, and that's not really the mobile category IMO.
 
Nope, the current AMD top chip is the 4900HS, and their best mobile chip is the 4900H, non-S. There's also the fact that Intel chips easily pull 90W in many high/top-tier notebooks, and that's not really the mobile category IMO.
Let's not forget the PCMark score also depends on the GPU, and currently the best you get in a Ryzen laptop is a 2060 Max-Q, compared to 2080s on the Intel side.
 
When did mobile performance get involved in a thread about desktop parts?
Come on, people, try to maintain some civility and stick to the actual subject at hand; this flame war is getting tedious.
 
Name brands mean nothing to me; I want the best gaming experience.
If it's Intel, it's Intel; if it's AMD, I'm actually surprised.
 

I don't see any facts, benchmarks, or valid data to continue arguing with you. You also didn't really refute any of my arguments, and instead veered so far off course as to start talking about CPUs from the mid-00s for lack of better arguments. How does this history tidbit relate to Comet Lake CPUs, exactly?

It's a well-known fact that single-threaded CPU performance gains over the past decade have been minimal (except for AMD, but they trailed Intel very hard). In the 90s, performance grew by up to 30% annually, but we now live in the 20s. There are also tasks which run at almost the same speed on Sandy Bridge and Ice Lake when both CPUs run at the same frequency, and those CPUs are eight years apart. That was unthinkable before the 10s. Don't believe me? Run the Fritz Chess Benchmark and see for yourself.
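For a quick-and-dirty stand-in for that kind of same-frequency test (just a sketch, not the Fritz benchmark itself): a fully serial integer dependency chain like the one below can't be vectorized or spread across cores, so at a fixed clock its runtime tracks per-clock single-core throughput. Lock both CPUs to the same frequency first if you want an apples-to-apples IPC comparison.

// Crude single-thread sketch (not Fritz Chess): a serial dependency
// chain of integer ops, so it can't be vectorized or split across
// cores. At a fixed clock, its runtime tracks per-clock single-core
// throughput; lock both CPUs to the same frequency before comparing.
#include <chrono>
#include <cstdint>
#include <cstdio>

int main() {
    constexpr std::uint64_t kIters = 500000000;  // 5e8 chained steps
    std::uint64_t x = 88172645463325252ULL;      // xorshift64 seed

    auto t0 = std::chrono::steady_clock::now();
    for (std::uint64_t i = 0; i < kIters; ++i) {
        x ^= x << 13;  // each step depends on the previous one
        x ^= x >> 7;
        x ^= x << 17;
    }
    auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                  std::chrono::steady_clock::now() - t0).count();

    // Print x so the compiler can't discard the loop.
    std::printf("checksum %llu, %lld ms (%.2f G steps/s)\n",
                (unsigned long long)x, (long long)ms,
                ms ? double(kIters) / (double(ms) * 1e6) : 0.0);
}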

And yes, the picture you've shown (I didn't know it existed, as I've never seen it before) perfectly represents the normal tasks of >95% of average people out there. The sad reality, which AMD fans don't really like, is that multi-core CPUs are mainly necessary for professionals. In the top 20, only WinRAR utilizes many cores, and out of the many dozens of people I know personally, zero run WinRAR regularly.

OBS Studio, ranked 151(!), is used by exactly 5% of people, surprisingly, so my estimates are right on the spot. Also, this app works miles better when you have a device capable of HW video encoding, so having more cores is moot at best even in this application.
 
Here are some facts.

Top benchmark scores recorded for PCMark 10, Fire Strike Extreme, and Time Spy Extreme.


[chart: top PCMark 10, Fire Strike Extreme, and Time Spy Extreme scores]
 
Win gaming, win the world... is it fair?

If you wanted fair, you picked the wrong universe.
 
This release is another version of 2015's Skylake. The power draw is going to be ugly.

14nm in 2020 is a sad state of affairs. I would avoid any of these relic Intel CPUs until they can move to 7nm in 2022.

Wow, groupthink is a thing, it seems.

So here's a little thought starter.

How does the i5-10400 (65W) stack up to the 3600 (65W)?

How about the i5-10500 (65W)?

How does the i5-10600 (65W) stack up to the 3600X (95W)?

These are all 6C/12T CPUs now and will co-exist at comparable price points. My thought is that in the midrange, Intel's new chips are going to clock AMD's 3XXX offerings (pun intended).

And everyone will still buy Ryzen because of the lower power draw, much cheaper prices, and superior multi-threaded performance.

Just look at Amazon.com's best-selling processors: 9 of the top 10 are Ryzen, I'm afraid.
 