Friday, February 18th 2022

Intel Raptor Lake with 24 Cores and 32 Threads Demoed

When Intel announced its first hybrid design, codenamed Alder Lake, we expected to see more of that design philosophy in future products. During Intel's 2022 Investor Meeting, the company provided insights into future developments, and the successor to Alder Lake is no exception. Codenamed "Raptor Lake," it features a new Raptor Cove P-core design that is supposed to bring a significant IPC uplift over the previous generation of processors. Built on the Intel 7 process node, Raptor Lake offers an ecosystem of features similar to Alder Lake's, but with improved performance across the board.

Perhaps the most notable change in Raptor Lake is the advancement in core count, specifically the increase in E-cores. Instead of eight P-cores and eight E-cores like Alder Lake, the Raptor Lake design retains eight P-cores and doubles the E-core count to 16. It is an unusual choice on Intel's part, but by no means a bad one. The total core count thus jumps to 24, and the total thread count reaches 32. Additionally, Raptor Lake will bring new overclocking features and retain socket compatibility with Alder Lake motherboards, meaning that, at worst, you would need to perform a BIOS update to get your existing system ready for the new hardware. We assume Intel has been working with software vendors and its own engineering teams to optimize core utilization for this next-generation processor, given the larger number of E-cores. Below, you can see Intel's demonstration of Raptor Lake running Blender and Adobe Premiere, along with the CPU core utilization.
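For anyone double-checking the headline numbers: each P-core runs two threads via Hyper-Threading while each E-core runs one, so the totals fall out of simple arithmetic. A quick sanity check, sketched in Python (the part labels in the comments just name the configurations discussed here):

# Core/thread math for Intel hybrid parts: each P-core contributes
# 2 threads (Hyper-Threading), each E-core contributes 1.
def totals(p_cores, e_cores):
    cores = p_cores + e_cores
    threads = 2 * p_cores + e_cores
    return cores, threads

print(totals(8, 8))   # Alder Lake 8P+8E   -> (16, 24)
print(totals(8, 16))  # Raptor Lake 8P+16E -> (24, 32)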
Source: via VideoCardz

153 Comments on Intel Raptor Lake with 24 Cores and 32 Threads Demoed

#51
Vayra86
john_Really? Because Intel's HEDT line was cheap before the first series of Ryzen processors from AMD, right? And I am also sure that Intel will start from 10/12-core+ models, because there were never quad-core models on Intel's HEDT platform, correct? Also, Intel never uses CPU features for market segmentation. They were the good guys before AMD, who "screws over" people by offering them up to 64 cores and 128 PCIe lanes in the HEDT line, with all features enabled from the top model to the bottom one.
Now we have to be thankful to Intel for blurring the lines between Atom CPUs and what, until recently, we knew as Core CPUs. We should be thankful to Intel because in the future we will be getting CPUs with more E-cores than P-cores. We shouldn't be complaining, because what the hell are we doing with our desktops anyway? Why pay $500-700 for 16 P-cores when we can pay the same and get 8 P-cores and 8 E-cores? Why pay $800-$1,000 today for 24 P-cores when, in the future, we can pay the same for 8 P-cores and 16 E-cores? Why pay $1,500 for 32 P-cores when we can pay the same in the future for 8 P-cores and 24 E-cores?

Intel, AMD, Nvidia: they are not football teams. Their gains are not always something to cheer for. Finding ways to maximize their profits is good for them, but not always something to make us happy. AMD will follow with Zen 4c cores, and we will end up paying for performance cores while only getting a fraction of them in the final product.
So you don't buy it; it's really quite simple that way.
#52
James7787
This hybrid architecture is the best thing Intel has done in the last decade. Before I started looking at Raptor Lake, I thought Zen 4 would blow it out of the water, but now I see the opposite is more likely. Multithreading is pretty much a lock for Raptor Lake, because AMD is unlikely to increase core count, while single-thread will be more interesting, though I don't think AMD can catch up. And next year, Zen 5 (hybrid, 3 nm) vs. Meteor Lake (Intel 4 and new E-cores) will be even more interesting. Basically, you can't go wrong with either company given how competitive they have become. Great times for CPUs.
#53
Vayra86
john_Stopped reading there. Why? Because you start with an accusation and at the same time prove that you didn't bother reading my post entirely. Read the whole post, remove your rose-tinted glasses, and try again.

I never bought a Bulldozer CPU, with the exception of a short period when I had an FM2+ APU (bought for the integrated GPU, not the CPU part), because those Bulldozers were not real 4/6/8-core CPUs. Just marketing.
You're not being honest here. It's not about whether you bought them or not. The subject you opened yourself was 'they say these are cores like any other'. But they really don't; we speak of P+E configs. As it was back then with FX, and as it is now with ADL, you need to read what the specs mean. But even with regular cores, you are supposed to RTFM, and do it well, if you want optimal performance.

But much more importantly, we might find the value of 'lesser' cores quite substantial if or when the technology matures or when the market 'catches up' to new hardware: this was even the case with FX procs, albeit extremely late.
#54
john_
Vayra86So you don't buy it; it's really quite simple that way.
When they are the only options on the market, it won't be a choice. In a few years we will be "extremely happy" with our 8+8 CPUs at $600, because the full 16+0 model will cost $1,000 and the performance difference won't make much sense.
Vayra86You're not being honest here. It's not about whether you bought them or not. The subject you opened yourself was 'they say these are cores like any other'. But they really don't; we speak of P+E configs. As it was back then with FX, and as it is now with ADL, you need to read what the specs mean. But even with regular cores, you are supposed to RTFM, and do it well, if you want optimal performance.

But much more importantly, we might find the value of 'lesser' cores quite substantial if or when the technology matures or when the market 'catches up' to new hardware: this was even the case with FX procs, albeit extremely late.
Most of the time I am trying to be honest. When I am not trying to be honest, it will be easy to see, because I will be trolling just for the fun of it, and in a way that, I repeat, will be easy to see.
If I weren't honest here, I would be cheering Intel's hybrid design, because I own both Intel and AMD shares. Anything good for the shareholder, and this hybrid design is the best thing that could happen to the shareholder, is also good for me. I gain more as a shareholder if desktop/mobile x86 CPUs follow the hybrid design than I lose as a consumer. But I like posting on forums as a consumer, because people here are also consumers.

I look at shops and I see "12700K, 12-core processor," not "8+4 processor." If YOU want to be honest, at least acknowledge that 90%+ of consumers out there wouldn't be able to, wouldn't care to, or would never even consider asking what an 8+4 hybrid design is. They will just believe what the shop, the salesperson, or the Intel box tells them: "12-core CPU." You doubt it? Look at the Android phone market. NO ONE markets their phones as having a 4+4 SoC. OCTA-CORES, they say.

As for my subject here, it was the "excitement" in the news article. You misunderstood my post, or you are not being honest and are simply distorting it. Should I repeat your own advice, to remove those "rose-tinted glasses" and read it again? I said that a hybrid future, where P-cores get stuck at 8 and E-cores increase in number to create an illusion of progress, IS NOT something that should make us feel excited. Do you understand it now, or is my English so bad that I should start posting in Greek and let Google Translate do the translation?

It's also convenient to throw all the blame on consumers for not RTFM-ing or not spending enough time to understand CPU specs. I am pretty sure that when you go out to buy, for example, a refrigerator, you read reviews of its parts to check whether every component, even the fan or the light bulb used, is the best model. Or that when buying a washing machine, you try to find out whether the steel used comes from country X, which produces grade-A steel, and not from country Y, which produces A- grade steel. People have better things to do with their free time than reading specs. You, me, everyone.

FX processors were AMD's take on the Pentium 4 idea, and also a way to negate Intel's thread advantage from Hyper-Threading. So they created a marketing CPU that gave them the chance to advertise high frequencies and a higher core count than Intel. Well, as with the Pentium 4, it didn't work out the way they were hoping.
#55
Manoa
The real problem with ADL is not E-cores or P-cores; it's the high cache/memory latencies and the OS scheduling decision problems. Everything else would work well otherwise. I would also say power efficiency vs. AMD is bad, but I don't want to compare it to AMD.
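One blunt workaround for the scheduling half of that complaint is to pin a latency-sensitive workload to the P-cores yourself. A minimal sketch using the psutil package, assuming the common Alder Lake enumeration in which logical CPUs 0-15 are the eight Hyper-Threaded P-cores and 16-23 are the E-cores; verify the mapping on your own chip before relying on it:

import psutil

# Assumed Alder Lake layout: logical CPUs 0-15 = P-core threads,
# 16-23 = E-cores. Check your own topology first.
P_CORE_CPUS = list(range(16))

proc = psutil.Process()          # current process; psutil.Process(pid) for another
proc.cpu_affinity(P_CORE_CPUS)   # restrict the OS scheduler to the P-cores
print("Now pinned to:", proc.cpu_affinity())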
#56
arni-gx
So, right now... Intel wants to release a new CPU series every year... It's very scary...
#57
john_
arni-gxSo, right now... Intel wants to release a new CPU series every year... It's very scary...
It's versatile. We will be getting refresh series where the P-cores stay the same and whatever exciting new features there are will focus on improving the E-cores, for example.
#58
thestryker6
john_Really? Because Intel's HEDT line was cheap before the first series of Ryzen processors from AMD, right? And I am also sure that Intel will start from 10/12-core+ models, because there were never quad-core models on Intel's HEDT platform, correct? Also, Intel never uses CPU features for market segmentation. They were the good guys before AMD, who "screws over" people by offering them up to 64 cores and 128 PCIe lanes in the HEDT line, with all features enabled from the top model to the bottom one.
Now we have to be thankful to Intel for blurring the lines between Atom CPUs and what, until recently, we knew as Core CPUs. We should be thankful to Intel because in the future we will be getting CPUs with more E-cores than P-cores. We shouldn't be complaining, because what the hell are we doing with our desktops anyway? Why pay $500-700 for 16 P-cores when we can pay the same and get 8 P-cores and 8 E-cores? Why pay $800-$1,000 today for 24 P-cores when, in the future, we can pay the same for 8 P-cores and 16 E-cores? Why pay $1,500 for 32 P-cores when we can pay the same in the future for 8 P-cores and 24 E-cores?

Intel, AMD, Nvidia: they are not football teams. Their gains are not always something to cheer for. Finding ways to maximize their profits is good for them, but not always something to make us happy. AMD will follow with Zen 4c cores, and we will end up paying for performance cores while only getting a fraction of them in the final product.
First of all, with regard to Intel, I said "ideally" because it's what I would like to happen, not what I think will happen. You cannot ignore the fact that AMD has massively increased the price of entry for HEDT over where both they and Intel used to be.

Intel's HEDT line started in the upper $300s until 9th/10th gen, when it moved to the upper $500s. Threadripper currently starts at $1,400 and carries much higher motherboard costs (though who knows what a hypothetical new Intel HEDT motherboard might cost).

If Intel can match AMD's 16c/32t in MT with 8P/8E/24T, why should anyone care what type of cores they are? You, much like everyone else complaining in this thread, didn't give a single actual reason why you're upset about the core configuration. On your take, what would you possibly use 16 lower-clocked P-cores for (which they would have to be, due to power consumption) that wouldn't be served better by 8P/8E? If your argument is that prices going up without real benefit is bad, then sure, I think we'd all agree with that, but nobody knows what pricing will look like.
#59
Unregistered
Minus InfinityWhat do you mean, maybe? Zen 5 is confirmed as a big.LITTLE design. The little cores will be Zen 4c cores at this stage, and they will destroy Gracemont+++ cores. So come late 2023/early 2024, Zen 5 vs. Meteor Lake will be very interesting. AMD is also possibly releasing all-Zen 4c CPUs with higher core counts for those who require huge MT performance. Zen 4c cores will strip away cache and won't have the IPC uplifts of regular Zen 4 cores, but they should still be stronger than Zen 3 cores. A 24/32-core Zen 4c part will be an MT beast.
So it's going to end up being a max-core-count contest to make the fastest chip now, is it? Add more cores for MT scores :p
ManoaThe real problem with ADL is not E-cores or P-cores; it's the high cache/memory latencies and the OS scheduling decision problems. Everything else would work well otherwise. I would also say power efficiency vs. AMD is bad, but I don't want to compare it to AMD.
Hopefully some of these will be fixed with BIOS/Windows updates. With even AMD seemingly going this way, Windows is going to have to get better with its scheduler.
#60
john_
thestryker6First of all, with regard to Intel, I said "ideally" because it's what I would like to happen, not what I think will happen. You cannot ignore the fact that AMD has massively increased the price of entry for HEDT over where both they and Intel used to be.

Intel's HEDT line started in the upper $300s until 9th/10th gen, when it moved to the upper $500s. Threadripper currently starts at $1,400 and carries much higher motherboard costs (though who knows what a hypothetical new Intel HEDT motherboard might cost).

If Intel can match AMD's 16c/32t in MT with 8P/8E/24T, why should anyone care what type of cores they are? You, much like everyone else complaining in this thread, didn't give a single actual reason why you're upset about the core configuration. On your take, what would you possibly use 16 lower-clocked P-cores for (which they would have to be, due to power consumption) that wouldn't be served better by 8P/8E? If your argument is that prices going up without real benefit is bad, then sure, I think we'd all agree with that, but nobody knows what pricing will look like.
By giving a 16-core processor to the mainstream market, AMD started its HEDT line at 16 or more cores. Someone needing many PCIe lanes can still go for an earlier series of Threadrippers instead of investing a considerable amount of money in the latest series. It's better than Intel completely abandoning the HEDT market, where the only option left is jumping to a Xeon chip - a Xeon chip that will probably be butchered in its feature set because of Intel's market segmentation. Intel was offering cheaper options because it was giving only 4-core CPUs to the mainstream market. And the cheaper Intel options for the HEDT market were also offered with cut-down features because of market segmentation.

I am not an expert on the HEDT platform, but I am pretty sure that Intel offering a 4-core 7th-gen HEDT processor for under $300, with cut-down memory support, PCIe lane count, cache, and probably other areas, was not met with enthusiasm. Nor was it a valid option for anyone needing an HEDT processor, other than those hoping to replace that 4-core HEDT joke in the future with something that would feel like a real HEDT CPU.

Motherboard costs were ALWAYS higher on EVERY Intel platform. Citing AMD CPU costs and AMD motherboard costs as negatives while leaving Intel's costs an open question doesn't make your remarks more objective.

Why do I see a hybrid design as something negative for the consumer? If this is not understood by now, what can I do? In Alder Lake, the high IPC of the P-cores is used both to give the CPU the single-thread crown and to cover up the performance deficiency of the E-cores. Intel is trying to find a way to remain competitive while being on a less dense manufacturing node than AMD, the same way AMD was trying to match Intel's manufacturing advantage 10 years ago by offering dual-core modules instead of real dual cores that would have taken more space on the die. If AMD's FX line had offered excellent IPC, today's CPUs would mostly have been dual- or quad-core modules. While we would be happy with the performance compared to older CPU models, we would be losing, without knowing it, the advantage of having full cores instead of modules in our CPUs. I really can't write a book to explain it further. I feel like I am stating the obvious to people who don't want to listen.
#61
Unregistered
AFAIC, if you want HEDT you will have to pay for it; ATM only AMD has any HEDT CPUs worth buying. Anyone buying Threadripper is not going to whine at the price if they need it for pro use; only rich boys wanting it for e-peen might.

Personally, I can't wait to see what AM5 brings. If it is good enough, I will happily ditch the ADL for it, as I have absolutely no loyalty to any camp. At worst, my ADL platform is good enough to carry me to Raptor Lake, if that turns out to be worth buying.
#62
Vayra86
john_When they are the only options on the market, it won't be a choice. In a few years we will be "extremely happy" with our 8+8 CPUs at $600, because the full 16+0 model will cost $1,000 and the performance difference won't make much sense.

Most of the time I am trying to be honest. When I am not trying to be honest, it will be easy to see, because I will be trolling just for the fun of it, and in a way that, I repeat, will be easy to see.
If I weren't honest here, I would be cheering Intel's hybrid design, because I own both Intel and AMD shares. Anything good for the shareholder, and this hybrid design is the best thing that could happen to the shareholder, is also good for me. I gain more as a shareholder if desktop/mobile x86 CPUs follow the hybrid design than I lose as a consumer. But I like posting on forums as a consumer, because people here are also consumers.

I look at shops and I see "12700K, 12-core processor," not "8+4 processor." If YOU want to be honest, at least acknowledge that 90%+ of consumers out there wouldn't be able to, wouldn't care to, or would never even consider asking what an 8+4 hybrid design is. They will just believe what the shop, the salesperson, or the Intel box tells them: "12-core CPU." You doubt it? Look at the Android phone market. NO ONE markets their phones as having a 4+4 SoC. OCTA-CORES, they say.

As for my subject here, it was the "excitement" in the news article. You misunderstood my post, or you are not being honest and are simply distorting it. Should I repeat your own advice, to remove those "rose-tinted glasses" and read it again? I said that a hybrid future, where P-cores get stuck at 8 and E-cores increase in number to create an illusion of progress, IS NOT something that should make us feel excited. Do you understand it now, or is my English so bad that I should start posting in Greek and let Google Translate do the translation?

It's also convenient to throw all the blame on consumers for not RTFM-ing or not spending enough time to understand CPU specs. I am pretty sure that when you go out to buy, for example, a refrigerator, you read reviews of its parts to check whether every component, even the fan or the light bulb used, is the best model. Or that when buying a washing machine, you try to find out whether the steel used comes from country X, which produces grade-A steel, and not from country Y, which produces A- grade steel. People have better things to do with their free time than reading specs. You, me, everyone.

FX processors were AMD's take on the Pentium 4 idea, and also a way to negate Intel's thread advantage from Hyper-Threading. So they created a marketing CPU that gave them the chance to advertise high frequencies and a higher core count than Intel. Well, as with the Pentium 4, it didn't work out the way they were hoping.
But that's the thing.

90% of consumers are NEVER going to understand a thing, despite the absolute requirement for 'customer due diligence'. It's just straight-up laziness, or information overload / lack of time / attention; no amount of 'extra info' is going to fix this for people. It's up to the people themselves at some point. You can't cater to the bottom end of the (social) ladder. That's precisely what commerce has been doing, and it doesn't help us in the slightest. They have as little understanding of the meaning of 12 cores versus 8+4 as they do of GB of RAM, megapixels, speaker watts RMS, etc., etc. You will never, EVER, fix stupid. The way to do that is not better commerce, it's better education - something we seem to forget a lot lately. Education as in school, not reading 'best answer' posts on forums and Google ;) Real education is aimed at making people capable of teaching themselves, of filtering through the bullshit and distilling the correct information. And of filtering, to begin with.

Lacking that, in the end, what matters for consumers is the real-world performance/characteristics of a product.
And in that sense, we agree on many points, don't get me wrong - maybe I misjudged your stance on 'lesser cores' as being just an Intel thing. I think we both recognize it happens everywhere. I, too, only consider tangible progress actual progress. Surely you've read my criticism elsewhere of products of either camp/color.

The point I'm trying to get across, though, is exactly what you called the 'excitement in the news post': we're in early-adopter land when it comes to ADL and the purpose of big.LITTLE on desktop. Some say 'just give me P-cores'. Others see the benefit of the higher core count, as it also frees up TDP budget for performance in single-thread-oriented workloads. Newer iterations of Zen are likely to follow up on the approach too. DDR5 is going to open up additional bandwidth, getting MSDT closer to HEDT performance (even if the bar moves up in that segment too, you get access to more workstation-like capability in MSDT), and that also opens up ways for applications to use higher core counts (a quick bandwidth calculation below illustrates the jump). Similarly, Zen also exceeds the 8-core 'limit' we long considered enough on MSDT. How is this not progress?

So the question I presented against your stance is that maybe it's a bit too early to judge E-cores as half-functional cores or 'lesser cores' altogether. You have no way of telling right now that this approach is not progress. And that's where the rose-tinted glasses are: you preemptively dismissed big.LITTLE maturing on desktop x86 into an actual advantage, but the performance numbers so far don't support that argument at all. ADL performs quite well with or without the E-cores, even with a much lower TDP ceiling than that horrible 241 W PL2 limit. And that's in (small) part due to the presence of E-cores that increase efficiency opportunistically. The low-hanging fruit in CPU land is gone. These are the baby steps we're going to be looking at, apart from just 'more silicon', sadly, to make progress at all.

The other aspect you mention, the commercial approach, is another matter altogether, and there we consumers do have the power: we just don't buy. If there is no progress, there is absolutely no need to upgrade, so that problem fixes itself sooner rather than later. If the price of upgrading is out of proportion to the gain, a similar situation exists. Look at GPUs right now. You can buy them at 2-3x the normal price, but people simply won't; there are limits. Have faith in those limits, is my message, and apply them to your own situation as well - something you also seem to do, very wisely.
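To put a rough number on the DDR5 bandwidth point above: peak theoretical bandwidth is just the transfer rate times 8 bytes per 64-bit channel, times the channel count. A back-of-the-envelope sketch; DDR4-3200 and DDR5-4800 are chosen as typical examples, not figures from the post:

# Peak theoretical bandwidth = MT/s * 8 bytes per 64-bit channel * channels.
def peak_gbs(mt_per_s, channels=2):
    return mt_per_s * 8 * channels / 1000  # GB/s

print(peak_gbs(3200))  # DDR4-3200 dual channel -> 51.2 GB/s
print(peak_gbs(4800))  # DDR5-4800 dual channel -> 76.8 GB/s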
#63
Unregistered
Every new-gen CPU going forward is probably going to be big.LITTLE, so we might as well just get used to it and hope Windows can get better at controlling it. I think the idea is a good one in theory, as it seems pointless to run apps that don't need a power core on one. Imagine kind of having two CPUs: one to run small background stuff and one to run big, more demanding foreground stuff. That is how I see ADL. As the first iteration of this type it may not be perfect, but it could be.
#64
john_
Vayra86But that's the thing.

90% of consumers are NEVER going to understand a thing, despite the absolute requirement for 'customer due diligence'. It's just straight-up laziness, or information overload / lack of time / attention; no amount of 'extra info' is going to fix this for people. It's up to the people themselves at some point. You can't cater to the bottom end of the (social) ladder. That's precisely what commerce has been doing, and it doesn't help us in the slightest. They have as little understanding of the meaning of 12 cores versus 8+4 as they do of GB of RAM, megapixels, speaker watts RMS, etc., etc. You will never, EVER, fix stupid. The way to do that is not better commerce, it's better education - something we seem to forget a lot lately. Education as in school, not reading 'best answer' posts on forums and Google ;) Real education is aimed at making people capable of teaching themselves, of filtering through the bullshit and distilling the correct information. And of filtering, to begin with.

Lacking that, in the end, what matters for consumers is the real-world performance/characteristics of a product.
And in that sense, we agree on many points, don't get me wrong - maybe I misjudged your stance on 'lesser cores' as being just an Intel thing. I think we both recognize it happens everywhere. I, too, only consider tangible progress actual progress. Surely you've read my criticism elsewhere of products of either camp/color.

The point I'm trying to get across, though, is exactly what you called the 'excitement in the news post': we're in early-adopter land when it comes to ADL and the purpose of big.LITTLE on desktop. Some say 'just give me P-cores'. Others see the benefit of the higher core count, as it also frees up TDP budget for performance in single-thread-oriented workloads. Newer iterations of Zen are likely to follow up on the approach too. DDR5 is going to open up additional bandwidth, getting MSDT closer to HEDT performance (even if the bar moves up in that segment too, you get access to more workstation-like capability in MSDT), and that also opens up ways for applications to use higher core counts. Similarly, Zen also exceeds the 8-core 'limit' we long considered enough on MSDT. How is this not progress?

So the question I presented against your stance is that maybe it's a bit too early to judge E-cores as half-functional cores or 'lesser cores' altogether. You have no way of telling right now that this approach is not progress. And that's where the rose-tinted glasses are: you preemptively dismissed big.LITTLE maturing on desktop x86 into an actual advantage, but the performance numbers so far don't support that argument at all. ADL performs quite well with or without the E-cores, even with a much lower TDP ceiling than that horrible 241 W PL2 limit. And that's in (small) part due to the presence of E-cores that increase efficiency opportunistically. The low-hanging fruit in CPU land is gone. These are the baby steps we're going to be looking at, apart from just 'more silicon', sadly, to make progress at all.

The other aspect you mention, the commercial approach, is another matter altogether, and there we consumers do have the power: we just don't buy. If there is no progress, there is absolutely no need to upgrade, so that problem fixes itself sooner rather than later. If the price of upgrading is out of proportion to the gain, a similar situation exists. Look at GPUs right now. You can buy them at 2-3x the normal price, but people simply won't; there are limits. Have faith in those limits, is my message, and apply them to your own situation as well - something you also seem to do, very wisely.
What you describe about others - laziness, the bottom end of the (social) ladder, little understanding, stupidity, (lack of) better education, people (not) capable of filtering through the bullshit and distilling the correct information - applies to everyone, ME AND YOU. You might understand 8+4 vs. 12 cores, I might also understand 8+4 vs. 12 cores, but there are so many things about EVERY TYPE of product out there that we don't understand. From a car, to a PC, to a refrigerator, to a TV, to a pair of speakers, to clothes, to even the food we buy. We don't have 10 lives to learn about everything, and so, yes, as you are saying, that's how commerce works. Otherwise everyone would be buying only one model from each product category.

I do see your points in the rest of your post, but I do not agree about Alder Lake. Alder Lake only looks like progress when it runs at 5+ GHz with spikes of over 200 W of power consumption and is compared to a 1.5-year-old CPU. I don't see progress in that. I only see an opportunity for companies to start selling E-cores at P-core prices in the future, and the mainstream platform going from 16 P-cores back to 8-12 P-cores at most. Intel has already done it, going from 10 P-cores to 8 P-cores; AMD will just follow later.

We consumers don't have a choice. That's a fantasy. We get an option when a company needs us. AMD needed us, so they gave us HEDT-class CPUs, like the Ryzen 7 1700, on a mainstream platform. Intel might need us to beta-test their first series of discrete GPUs, so they might offer us some GPUs at really good value. In the majority of cases, companies will avoid giving us a very good alternative just to avoid a price war with a bigger competitor. As for "no progress = no reason to upgrade," excuse me, but Intel was selling 4 cores for a decade and I never saw their sales go down. The GPU example is flawed. A valid GPU example is the RX 480, which was followed by the RX 580, then replaced by the RX 5500 and then the RX 6500. Not just a lack of progress, but outright degradation.
#65
Unregistered
So how many P-cores do you want? Do normal consumers really need more than 8 cores/16 threads? I don't really think so. Prosumers will just buy pro CPUs if they need them. Most normal users mostly game, with a bit of e-peen benching, so IMO they don't really need more than 8c/16t. Maybe I am wrong.
#66
john_
10 years ago the question was, "Who needs more than a 4-core/8-thread i7? It can do just about everything a normal consumer wants."
#67
TheoneandonlyMrK
john_10 years ago the question was, "Who needs more than a 4-core/8-thread i7? It can do just about everything a normal consumer wants."
Then I would agree there are a lot of not-normal people about; I can admit I am one.
Hence 3 PCs and rising.
#68
Unregistered
Isn't it the same now, only with 8c/16t? People just think they need 16+ core CPUs for home use but wouldn't even use them fully.
#69
Vayra86
john_10 years ago the question was, "Who needs more than a 4-core/8-thread i7? It can do just about everything a normal consumer wants."
And frankly, it still applies...

We as consumers might need to reflect a little on our own behaviour in that sense. We love to be misled, it seems, and we love to think there is always something better around the corner.

You made some great points, but I'm also seeing a constant misplaced drive to keep buying. Not out of necessity, but 'because we can'.

It is our excessive consumerism that created what we have: mountains of e-waste. If we cannot collectively change that, we won't change the nonsensical 'upgrades', and we open the door to profit maximization by companies.

Note that all of these consumer traits are well-researched things. We are being played. And we build a narrative for ourselves, telling each other that companies are to blame, so we can stop looking in the mirror and facing the truth.

As for ADL... I'm not convinced yet myself and won't buy into it, even though I also see the CPUs work fine at TDP limits similar to Zen's. But I simply don't have the workloads yet to go up in core count. That's a clear no-buy situation to me. The only moment I upgrade is when I see the performance is required across the board. And I'm seriously questioning whether that isn't exactly the same for almost everyone on this forum, yes, even those who might get a 10% advantage from the connected GPUs.

But when you get years of baby steps, that 10% in our heads might create the silly narrative we call the 'upgrade itch' - that nagging feeling of money burning a hole in your pocket for no reason at all but emotion.
TiggerSo how many P-cores do you want? Do normal consumers really need more than 8 cores/16 threads? I don't really think so. Prosumers will just buy pro CPUs if they need them. Most normal users mostly game, with a bit of e-peen benching, so IMO they don't really need more than 8c/16t. Maybe I am wrong.
Exactly this. Things have limits. Even if we think otherwise.
#70
ThrashZone
CrackongFor the same die area.
I would prefer 14+0 instead of 8+16 as a desktop CPU
Hi,
You can't count defective cores as real cores, though; they'd get backlash.
14 cores/28 threads: "wow, I have one of those, with a terrible mesh/cache design." But these 12th-series chips are where you only get 8 decent OC cores and 6 crappy ones.
The only perks are the awesome cache OC, and DDR5 if you want to count that too.

Intel just came out with a gimmick to use cores they used to disable through binning, or toss because they were space heaters at the same voltage as good cores.
But they make okay limited threads.
#71
john_
TiggerIsn't it the same now, only with 8c/16t? People just think they need 16+ core CPUs for home use but wouldn't even use them fully.
One of my systems is a 6-core Thuban (a quad-core Athlon GR unlocked fully to a Thuban) with a good SATA SSD. For typical desktop usage, a casual user wouldn't really see any difference compared to my Ryzen 2600X with the OS on a fast PCIe 3.0 NVMe SSD. Obviously, the old AM3 CPU and platform on a SATA SSD will be slower than the AM4 + NVMe setup, but not so much slower as to annoy the user. The casual user, AND me, will get used to the speed at which this AM3 setup gets the job done and never complain, as long as the system is smooth, without hiccups, and programs open quickly. There were two cases where I had a somewhat bad experience in the past: an i5 4460 using its integrated graphics, where the desktop experience was not smooth at all, especially when trying to load a dozen tabs (something I was doing every day when searching for news; with a low-end discrete GPU it was smooth), and an Athlon 3000G that had a hard time playing, for example, a YouTube video while running a virus scan on my NVMe drive (so no delay from storage). Benchmarks showed both the i5 and the 3000G as faster, but the experience said something different.

What I mean is that this is NOT a case of "people just think they need 16+ core CPUs for home use but wouldn't even use them fully." If that were the case, an Android tablet would do most jobs nicely. A 10-year-old quad-core system with an SSD would also do most jobs nicely. No one looks at Apple's M1 and says, "Too few cores."
But we are not talking about what the average Joe needs. We are talking about progress, and from the beginning my whole argument has been that a hybrid design is not something to make us feel excited. It's a necessity for Intel to keep up in this core-count war. Until Ryzen 3000, Intel was fine. They had the IPC advantage, and 8-10 faster cores were sometimes better than 16 slower ones. They lost that advantage with the Ryzen 5000 series, and they can't be certain they won't lose it again with Zen 4. So they went the hybrid way to at least keep up in core count without having to build a line of massive chips that need 300 W of power. It's an excellent strategy from Intel, not something to make us consumers "excited," especially when it is already obvious that the exciting thing of the future is "more E-cores."
Vayra86And frankly, it still applies...

We as consumers might need to reflect a little on our own behaviour in that sense. We love to be misled, it seems, and we love to think there is always something better around the corner.

You made some great points, but I'm also seeing a constant misplaced drive to keep buying. Not out of necessity, but 'because we can'.

It is our excessive consumerism that created what we have: mountains of e-waste. If we cannot collectively change that, we won't change the nonsensical 'upgrades', and we open the door to profit maximization by companies.

Note that all of these consumer traits are well-researched things. We are being played. And we build a narrative for ourselves, telling each other that companies are to blame, so we can stop looking in the mirror and facing the truth.

As for ADL... I'm not convinced yet myself and won't buy into it, even though I also see the CPUs work fine at TDP limits similar to Zen's. But I simply don't have the workloads yet to go up in core count. That's a clear no-buy situation to me. The only moment I upgrade is when I see the performance is required across the board. And I'm seriously questioning whether that isn't exactly the same for almost everyone on this forum, yes, even those who might get a 10% advantage from the connected GPUs.

But when you get years of baby steps, that 10% in our heads might create the silly narrative we call the 'upgrade itch' - that nagging feeling of money burning a hole in your pocket for no reason at all but emotion.
I believe I was answering you while you were posting.
Vayra86Exactly this. Things have limits. Even if we think otherwise.
That's what Intel's strategy is based on: "How many cores do consumers need? 8 max. Why? Because most software can't use or doesn't need more, and modern consoles come with 8c/16t CPUs. What do we do to keep up with AMD? We throw little cores into the mix. We write 8+8 cores on the box to avoid negative press and lawsuits, but we know they will be sold to the average consumer as 16-core CPUs."

The question is, "we are being played," as you say. Are we going to not just accept that fact, but also support it? Like it? Even argue in favor of the company that tries to play us the most?

I am really curious about the 6500 XT. If I spend time in here reading comments about that card, am I going to read positive comments about the 6500 XT from the same people who say this hybrid design is good? I can think of a few:
4 GB of RAM: bad for miners, good for gamers.
4 GB of RAM: who needs more on a low-end card? It keeps the price down. Better than having 8 GB and being more expensive.
A mobile chip: good job, AMD, offering us this solution. Better than offering nothing.
Who cares about PCIe 3.0 performance with that PCIe x4 limitation? People should move to PCIe 4.0 anyway. It probably cuts the cost of the card too. So it's good!
A 64-bit data bus? Bad for miners, so it is good. Who needs a 128-bit data bus? We gamers have that huge 16 MB of Infinity Cache anyway.
There is probably more I could think of, but better to stop here. I don't like writing things I do not believe.
#72
ghazi
As a committed AMDumb for over two decades and counting, I find the reaction to this news from the AMD fans a bit strange. Yes, it is an incremental update, and we would all prefer to have more P-cores. But look at what you're saying: complaining about Intel adding "moar coars".

Yes, the E-cores are slower, and yes, it was a great thing that AMD ""democratized"" higher core counts. But the E-core focus isn't hampering your ST performance. One user said something along the lines of "why can't we have 32 P-cores". You could have bought a 3970X with 32 cores two years ago, but you didn't. Why? Unless you do 3D modeling for a living or some other high-powered workstation task, the MSDT CPUs were the same if not better for you in terms of real performance. These P-cores are infernos to a much greater degree, and you're not stitching a ton of them together without paying a price.

Intel's big.LITTLE, for all its flaws and issues, solves one problem really well, in that it allows Intel to have leadership in ST and MT at the same time. The 8 P-cores are faster than AMD's fast cores, and none of us "enthusiasts" really care to have more, as far as workloads that are highly sensitive to ST performance go. Some do need more MT performance, but when you're talking about a program using 16-32+ cores effectively, what difference does it make to you how fast the individual cores are, so long as, summed up, they're faster than the competition's CPU? You get something better all around, from games to Blender, and then you complain because the cores aren't all the same. Granted, I am looking forward to Zen 4 and hoping AMD can regain ST leadership, in which case Intel banking on its 16 extra weak cores making up for an ST deficit wouldn't be a winning proposition when AMD already gives you 16 big cores to play with and no weird compatibility issues, etc.

Now, I still think big.LITTLE is far from perfect, and it's more beneficial to Intel than it would be to AMD at the moment, since Intel is still using monolithic designs, where area is a bigger concern and the E-cores do really well; whereas on power consumption their benefit is very much debatable, and AMD is mitigating the cost of increased silicon area through its tiny CCDs. Just my two cents; probably obvious to many, but people will always argue.
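The "summed-up throughput" argument above is easy to make concrete with a toy model. The per-core weights below are purely illustrative assumptions, not benchmark numbers; the point is only that, for an embarrassingly parallel workload, the sum is what matters, and whether a hybrid config beats an all-big-core one depends entirely on those weights:

# Toy aggregate-throughput model. Weights are illustrative assumptions:
# a P-core running two threads ~1.3x a single-threaded P-core, an
# E-core ~0.5x a P-core. Swap in measured numbers for a real comparison.
P_HT, E = 1.3, 0.5

hybrid = 8 * P_HT + 16 * E    # 8P+16E, Raptor Lake-style -> 18.4
all_big = 16 * P_HT           # 16 big cores with SMT     -> 20.8

print(f"8P+16E: {hybrid:.1f}  vs  16 big cores: {all_big:.1f}")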
#73
Unregistered
In some things, ADL is more than keeping up with Ryzen 5000. If they had given it more than 8 P-cores, it would have thrashed it, albeit at a higher max wattage. ADL uses less power than Ryzen for gaming, and this has been shown, so for gaming a 10- or 12-core ADL would have been a big hit.

I don't think they did it to keep up with Ryzen; I think it was to test big.LITTLE. They have shown that their future chips are going to be like this, and I am sure they would not have done that just to keep up. IMO, ADL was a test of it. All next-gen CPUs will be big.LITTLE, even AMD's, so maybe Intel has gotten the jump on them by testing it with ADL.

You think Intel manufactured ADL just to keep up with the core count? Then maybe that is what they are doing with Raptor Lake and Meteor Lake too? Or just maybe ADL was a test of big.LITTLE and they have decided it is a good idea.

As already shown, Intel is going to be using tiles instead of a monolithic design, which will give them much better options as regards P-core count etc. They can just add more tiles, as AMD does with their chiplets, for more P-cores.
#74
Max(IT)
CrackongFor the same die area.
I would prefer 14+0 instead of 8+16 as a desktop CPU
Again, that’s because you don’t know what you are speaking about.
#75
Vayra86
john_I believe I was answering you while you were posting.

That's what Intel's strategy is based on: "How many cores do consumers need? 8 max. Why? Because most software can't use or doesn't need more, and modern consoles come with 8c/16t CPUs. What do we do to keep up with AMD? We throw little cores into the mix. We write 8+8 cores on the box to avoid negative press and lawsuits, but we know they will be sold to the average consumer as 16-core CPUs."

The question is, "we are being played," as you say. Are we going to not just accept that fact, but also support it? Like it? Even argue in favor of the company that tries to play us the most?

I am really curious about the 6500 XT. If I spend time in here reading comments about that card, am I going to read positive comments about the 6500 XT from the same people who say this hybrid design is good? I can think of a few:
4 GB of RAM: bad for miners, good for gamers.
4 GB of RAM: who needs more on a low-end card? It keeps the price down. Better than having 8 GB and being more expensive.
A mobile chip: good job, AMD, offering us this solution. Better than offering nothing.
Who cares about PCIe 3.0 performance with that PCIe x4 limitation? People should move to PCIe 4.0 anyway. It probably cuts the cost of the card too. So it's good!
A 64-bit data bus? Bad for miners, so it is good. Who needs a 128-bit data bus? We gamers have that huge 16 MB of Infinity Cache anyway.
There is probably more I could think of, but better to stop here. I don't like writing things I do not believe.
This is something of all ages, and I really fail to see any difference between Intel, AMD, Nvidia, or Facebook in that sense... These are corporations, and they thrive on influence and on money. Cognitive dissonance is not unique to this market either. The core principle at work here is 'I must buy, I must upgrade, because that's what I used to do, it's what we're supposed to do'. Companies play on this. From birth, we are fed a narrative that consuming is good. It keeps the economy going and keeps people working, which means we can live more comfortably. That narrative is pushed every day from every possible angle. Even in the face of a global pandemic, where the risk of long-term physical damage and even death (!) is major, we/some propose to keep working to keep the economy going. Even the non-essential parts of it. Bread and circuses: we're still those idiot plebs sitting in the Colosseum watching gore and blood, except now it's digital, sometimes. Such civilization.

It's not even about acceptance of the fact, is it? It's about us just not knowing any better. We must buy. We shall buy. So what if the card has 4 GB. So what if the CPU has cores that hardly do a thing in practice. Money must roll; we worked hard for it; we want our dopamine shot.

Reflect. That is what I'm saying. When you start going into the detail of P/E cores and whether or not those are good, we've already gone way too deep down the rabbit hole. The real question is: do you need them? The marketing is of a similar nature: when we're down to that level of detail, are we not skipping past far more important indicators of performance? You said it right when you pointed out that it's impossible to know everything about everything.

As for these 4 GB cards and hardly-progressing CPUs, they exist for exactly that reason: buy, buy, buy. You can upgrade your card again to a newer generation. Whoop-de-doo! Finally we can buy a GPU. Only idiots, and those who have (or see) no other options, downgrade to upgrade; let's be honest here. The specs are irrelevant to that group. Anyone else with a lick of sense will skip past something that will never hold value past this dank pit of scarce product, which will eventually end, as all things do.

Markets usually self-correct. It's a mistake to use a limited scope on the market to determine what's what, which is my main (and only) concern with your stance on ADL right now. You could be right in the end, but you're likely not. That applies to both ADL and this 6500 XT example. Isn't this all a big pile of 'who the hell cares, and why would we even care'? You just skip the product, and in doing so, you voice the fact that you're not interested in it. That's how the market works. Similarly, with regard to those Intel quads we had for ten years... apparently none of these consumers felt any need to buy into Extreme CPUs that offered 6 cores or more. That literally spells out that there is no market for it in the regular consumer space. Fast-forward to 2022, and I dare say Intel had a very good view of the demand in the market at the time and made sound business decisions up to and including Skylake. The kicker here is that people (gamers! the consumer segment!) ARE in fact spending over 1K on board and CPU right now. That's the price point Extreme was at...

Why would there suddenly be a market now for infinitely scaling core counts? That's literally the salesman's pitch: creating demand where there is none. Like I pointed out earlier, we are fooling ourselves with supposed demand that is not 'necessity'. And depending on the glasses you wear, you could quite simply say the same of AMD, who was first in pushing the core-count 'war' that is remarkably similar to a megapixel race, and all the other 'look at my numbers' races between companies. A neutral pair of glasses, IMHO, should value all of these activities exactly the same: companies that want to move product, and silly customers buying the fantasy.