
14nm 6th Time Over: Intel Readies 10-core "Comet Lake" Die to Preempt "Zen 2" AM4

it works both ways...
 
I've been into this hardware stuff for almost 2 decades and I've never seen high-end PC components being so premium.
I'll admit that GPUs have never cost so much. But CPUs regularly cost over $500 when they are released. FFS the FIRST AMD "FX" chips (back in the Athlon 64 days) cost over $900.
 
Will the base clock be 3.2 GHz to get that 95 W TDP?
 
Will the base clock be 3.2 GHz to get that 95 W TDP?
Lower base clock in exchange for more cores is nothing new. However, all cores turbo is where it's at.
That said, 10 cores within 95W would be quite a feat, I expect these to need a little more than that.
 
Lower base clock in exchange for more cores is nothing new. However, all cores turbo is where it's at.
That said, 10 cores within 95W would be quite a feat, I expect these to need a little more than that.

Even their 6 cores royally boost beyond the stated TDPs, so yea, 'just a little' :P
 
Intel Readies 10-core "Comet Lake" Die to Preempt "Zen 2" AM4
Intel, as things are, cannot preempt AMD; they're playing defense.
 
Considering the 8-core/16-thread i9 9900K costs over 600 bucks,
I don't want to know how much they will charge for that refresh-refresh-refresh 14nm 10-core CPU.
 
Even their 6 cores royally boost beyond the stated TDPs, so yea, 'just a little' :p
TDP is specified at base clocks, let's not have this discussion again. We're already off topic.
 
Comet Lake? Really? Well, I hope it doesn't crash right on top of them...

I'll admit that GPUs have never cost so much. But CPUs regularly cost over $500 when they are released. FFS the FIRST AMD "FX" chips (back in the Athlon 64 days) cost over $900.

Official launch prices for the Core i7-x7xx-K models had mostly been around $350, at least... Although before AMD woke up, if you wanted 6 cores or more, you had to be prepared to give up quite a bit more, usually 999 dollars for the CPU...

The problem I see here is these don't work on current motherboards. And if you buy one of these, you know Ice Lake is (eventually) coming and will require yet another motherboard. The CPUs themselves are fine. Everything around them is not, however.

How long has it been since Ice Lake appeared in roadmaps and stuff? It feels like it should have been released like years ago...
 
How long has it been since Ice Lake appeared in roadmaps and stuff? It feels like it should have been released like years ago...
2-3 years. I lost count.
 
Although before AMD woke up, if you wanted 6 cores or more, you had to be prepared to give up quite a bit more, usually 999 dollars for the CPU...
Right, and AMD did the same when they had superior CPUs in the past. Even Intel had a model between $350 and $999 during i7's glory days.
 
Right, and AMD did the same when they had superior CPUs in the past. Even Intel had a model between $350 and $999 during i7's glory days.
I'm still not sure why this gets brought up so often.
You lead by a sizeable margin, you get to charge a premium. That's how business works. End of story.
 
Although before AMD woke up, if you wanted 6 cores or more, you had to be prepared to give up quite a bit more, usually 999 dollars for the CPU...

It's not entirely true; my six-core i7 5820K cost me 350€ (~$350) in 2014.
 
Oh look, oh look, this guy can't even look back to 144Hz but can enjoy games at sub-30 FPS on consoles. Bad liar.

Say what? <3



News flash: that 500 dollar CPU was never a requirement to play games. Not even for your magical 240hz - that is just a surrealistic FPS target consisting of 40% marketing and 60% placebo that you've fallen for. Evidence in the fact that 'you can't look back to 144hz'... it's sad and funny all at the same time. You have no idea. Just because a number is higher doesn't mean it means anything. It's the same as with 4K monitors and how half the gamurs play on them at a downscaled 1080p because 'woopsie, games really do get tougher on my GPU over time'. There is always going to be something to blow insane amounts of money on, with questionable benefits. Wake up... You can still easily game on a reasonable budget.

You remind me of those guys who once said "the human eye can't see more than 24 fps". And also those who said 120hz was useless as no one ever needs more than 60hz, and so on... I still have my 144hz LG and I do a lot of tests between it and my Dell AW2518, and the difference is HUGE, to the point that I can now see the mouse pointer lagging on the Windows desktop. You can live in denial if you want. 240hz is objectively superior to 144hz and everyone can notice it. And if you use 240hz for a long time you will not accept 144hz. Also, you don't need 240fps to take advantage of 240hz. Research the Blur Busters forum and website and read the technical analysis on the subject by the Chief.
 
Say what? :love:





You remind me of those guys who once said "the human eye can't see more than 24 fps". And also those who said 120hz was useless as no one ever needs more than 60hz, and so on... I still have my 144hz LG and I do a lot of tests between it and my Dell AW2518, and the difference is HUGE, to the point that I can now see the mouse pointer lagging on the Windows desktop. You can live in denial if you want. 240hz is objectively superior to 144hz and everyone can notice it. And if you use 240hz for a long time you will not accept 144hz. Also, you don't need 240fps to take advantage of 240hz. Research the Blur Busters forum and website and read the technical analysis on the subject by the Chief.

Cool story, but your point was that gaming is unaffordable these days, and then you bring in surreal demands in terms of FPS targets that less than 1% of the audience cares about. The vast majority of games cannot even keep half that framerate on the fastest CPU. You're comparing ultra high end demands to mid-range price points and then saying 'it's not fair everything costs so much' and/or 'why can't Intel make a faster CPU'.

So yeah, it's quite hilarious to see you didn't get that memo right here. So great, you can now notice your mouse pointer lagging on the desktop at 144hz. I don't. Whose experience is really better now? :) Diminishing returns is a thing, look it up. 24 fps or 60 or 144 or 240, it's quite a stretch. There will always be a next best thing; that doesn't mean it's something you'd need.

You know I'm on a 120hz panel too. I'm sure that a higher refresh rate will still be smoother, but the investment versus the payoff simply isn't worth it, and consistency suffers. It's better to have a slightly lower but fixed FPS/refresh than 'as high as possible' while only hitting it rarely. Why? So I can use my strobing backlight, which helps motion clarity far more than even 480hz would. I don't spend my days counting pixels or pushing my nose into my panel so I can spot the smoothness of my 240hz mouse pointer. I play games :)
 
TDP is specified at base clocks, let's not have this discussion again. We're already off topic.
I love the way you agree completely with his statement but are so determined to defend Intel that you managed to make it combative and stupid.

Intel's shit runs hot these days. Way hotter than their marketing implies or even outright states it actually should. Quite why you're so married to the idea of minimising that is beyond me, but here we are again - another piece of news that makes Intel look terrible, and you're on deck with damage control.

Right, and AMD did the same when they had superior CPUs in the past. Even Intel had a model between $350 and $999 during i7's glory days.
Your point ignores that "back in the day" Intel did not segment their products into "Mainstream" and "HEDT" segments.

The first chips Intel formally denoted as separate from the "Mainstream" platform were the Sandy Bridge-E chips, the 3960X specifically, on LGA2011 with an RRP of $1059. The 2700K was $339. Before that, the high-end chips existed on the same platform as the lower-end chips: the 980X was also $1059, but it sat in exactly the same motherboards as the i7 920, which was $305 and was a highly recommended budget/overclocker's chip for what we now refer to as "mainstream" users.

Since the inception of the i7 branding, RRPs for mainstream CPUs have gone from $305 to $499 (a 63.61% increase over 7 years), and "HEDT" processors have gone from $1059 to $1999 (an 88.76% increase over 7 years).

In the same 7 year period, the dollar has only inflated from $100 to $112.42, meaning that in all segments, Intel's price increases have outstripped inflation.
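The percentages above are easy to sanity-check; here is a minimal sketch of the arithmetic, using the MSRPs and the cumulative inflation factor quoted in this post (taken at face value, not independently verified):

```python
def pct_increase(old: float, new: float) -> float:
    """Percentage increase going from old to new."""
    return (new - old) / old * 100

# MSRPs and inflation factor as quoted above
mainstream = pct_increase(305, 499)    # i7 920 -> current mainstream flagship
hedt = pct_increase(1059, 1999)        # 3960X -> current HEDT flagship
inflation = pct_increase(100, 112.42)  # cumulative dollar inflation, ~7 years

print(f"mainstream: +{mainstream:.2f}%")  # +63.61%
print(f"HEDT:       +{hedt:.2f}%")        # +88.76%
print(f"inflation:  +{inflation:.2f}%")   # +12.42%

# Deflating today's price first gives the increase in constant dollars:
real_mainstream = pct_increase(305, 499 / 1.1242)
print(f"mainstream, inflation-adjusted: +{real_mainstream:.2f}%")  # +45.53%
```

Even after stripping out inflation, the mainstream flagship costs roughly 45% more in real terms, which is the post's point in a single number.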

On top of that, the end of Moore's law and the stagnation of core counts and clocks mean that while prices have been rising more quickly, performance has been rising more slowly.

In other words, for the last 7 years, Intel has been fucking us all and even Intel fanboys should be glad that AMD are finally stepping up to put pressure on them to be more competitive in all areas.
 
Cool story, but your point was that gaming is unaffordable these days, and then you bring in surreal demands in terms of FPS targets that less than 1% of the audience cares about. The vast majority of games cannot even keep half that framerate on the fastest CPU. You're comparing ultra high end demands to mid-range price points and then saying 'it's not fair everything costs so much' and/or 'why can't Intel make a faster CPU'.

So yeah, it's quite hilarious to see you didn't get that memo right here. So great, you can now notice your mouse pointer lagging on the desktop at 144hz. I don't. Whose experience is really better now? :) Diminishing returns is a thing, look it up. 24 fps or 60 or 144 or 240, it's quite a stretch. There will always be a next best thing; that doesn't mean it's something you'd need.

You know I'm on a 120hz panel too. I'm sure that a higher refresh rate will still be smoother, but the investment versus the payoff simply isn't worth it, and consistency suffers. It's better to have a slightly lower but fixed FPS/refresh than 'as high as possible' while only hitting it rarely. Why? So I can use my strobing backlight, which helps motion clarity far more than even 480hz would. I don't spend my days counting pixels or pushing my nose into my panel so I can spot the smoothness of my 240hz mouse pointer. I play games :)

Strobing is an awful technology that not only introduces flickering, which is bad for some eyes (including mine), but also adds double the input lag. I have strobing on my 240hz panel and I never use it. You don't get my point. It's not about having 240fps! Having a 240hz panel and a framerate that fluctuates from 120 to 200 is already smoother than any 120hz/144hz panel, even with G-Sync/FreeSync, because the monitor can actually display every frame without needing to lock the framerate. The motion clarity is amazing, the response time is near 0.5 ms (as close as it gets to OLED or CRTs), and the input lag is half that of 120hz! Even if you run 60fps at 240hz it will have half the input lag compared to 120hz (just an example, I always try to run high framerates).
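For what it's worth, the "half the input lag" part of this is simple refresh-interval arithmetic; a minimal sketch (this covers only the interval between refreshes — game engine, OS, and panel response add latency that does not scale with Hz):

```python
def refresh_interval_ms(hz: float) -> float:
    """Time between two consecutive display refreshes, in milliseconds."""
    return 1000.0 / hz

for hz in (60, 120, 144, 240):
    print(f"{hz:>3} Hz -> {refresh_interval_ms(hz):.2f} ms per refresh")
# 60 Hz  -> 16.67 ms
# 120 Hz -> 8.33 ms
# 144 Hz -> 6.94 ms
# 240 Hz -> 4.17 ms
```

A 240hz panel starts scanning out a new frame every ~4.2 ms versus ~8.3 ms at 120hz, which is where the "half" comes from, even when the game's framerate is the same on both.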

Having a 120fps-200fps range in games like Overwatch, Quake Champions, Dirty Bomb, Rainbow 6, Destiny 2 or Warframe, at medium/high settings 1080p, doesn't require an unrealistic amount of juice! It requires a very good CPU for gaming, yes! Something like the 8600K at 4.8 GHz will do! Something Ryzen can't deliver, otherwise I would have one! Simple as that. You need to understand everyone has different needs. I love my 240hz setup and I don't want a CPU that will keep my minimum fps at 85 or 90, as you can see, for example, in today's BF V multiplayer analysis by Steve, with Ryzen vs Intel highlighted.

Now you can argue that Intel CPUs are so expensive right now that it is madness to buy them instead of a Ryzen, and I agree! 100%! But 1 year ago that wasn't the case.

One thing I can guarantee you: 240hz is not placebo and it really improves your experience in every game that is not a 2D platformer or maybe an RTS! The point about persistence is valid even if you can't get close to 240fps; you don't need to. When you move your mouse relatively fast, even in a 3rd-person RPG, 240hz at 1080p will provide better clarity and image quality in motion, even compared to 4K 60hz! Because there will be no distortion/smearing/ghosting. You should try it one day and then you'll agree with me.

And remember, 240hz monitors go as low as 260€ nowadays on European Amazon. That's not a premium price.
 
Now just make a K version with no iGPU.. *keeps dreaming*
As always, what you want already exists; it's called HEDT. Now, why you would want to spend more money to lack an iGPU, when the Core i5s have proven the iGPU does NOT impede overclocks, is beyond me, but hey, the platform exists for people like you.

Stop whining and go buy one.
 
So if the 9900K wasn't hot enough, may I recommend "9590" for the 10-core model. Seems suitable for a chip made hot by squeezing every last bit of performance out of the current generation.

You can even reuse AMD's marketing with a few minor changes.

[Image: AMD FX "Centurion" 5 GHz E3 marketing slide]

Considering my 8350 is 5.0 across all 8 cores...
 
As always, what you want already exists; it's called HEDT. Now, why you would want to spend more money to lack an iGPU, when the Core i5s have proven the iGPU does NOT impede overclocks, is beyond me, but hey, the platform exists for people like you.

Stop whining and go buy one.

They're not releasing this on HEDT. HEDT doesn't have the ringbus that games love.
 
Say what? :love:

I didn't mean you don't have the hardware; I meant someone who can't even look back to 144Hz would not be able to enjoy games at 30FPS on consoles. But it seems like you're in love with consoles & console games, based on your previous posts.
 
I didn't mean you don't have the hardware; I meant someone who can't even look back to 144Hz would not be able to enjoy games at 30FPS on consoles. But it seems like you're in love with consoles & console games, based on your previous posts.

OT but I'll bite... maybe he plays different games on consoles that don't require the same refresh to enjoy?

What he's saying isn't impossible.
 
Say what? :love:

You remind me of those guys who once said "the human eye can't see more than 24 fps". And also those who said 120hz was useless as no one ever needs more than 60hz, and so on... I still have my 144hz LG and I do a lot of tests between it and my Dell AW2518, and the difference is HUGE, to the point that I can now see the mouse pointer lagging on the Windows desktop. You can live in denial if you want. 240hz is objectively superior to 144hz and everyone can notice it. And if you use 240hz for a long time you will not accept 144hz. Also, you don't need 240fps to take advantage of 240hz. Research the Blur Busters forum and website and read the technical analysis on the subject by the Chief.

You make 0 sense, and besides you got vayra's post totally wrong.
 