
Is Intel going to deliver a processor/chipset worth waiting for?

Benefits compared to Intel? Dafuq are you on about - that's the most backwards logic I've ever heard. If we're talking about whether or not V-Cache benefits games, we look at how the equivalent AMD CPU without V-Cache does compared to the one with it - i.e. 7700X vs 7800X3D. And it essentially always helps substantially. How the 7800X3D performs in a game compared to the 14900K is entirely irrelevant to whether V-Cache helps - which was your claim, that it only helps in a few games, and which is obviously pulled right out of your arse.
Sure, technically you are correct, not arguing that. But you asked about the Intel comparison, so in that context a game benefiting from V-Cache is irrelevant if it still loses to the comparable Intel CPU. There are only a handful of games in which the 3D cache leads Intel by a healthy margin.

As for your "points" in your previous post - if an MT workload is your purpose with a workstation, then you get a 7950X or a Threadripper and beat anything Intel has to offer. Simple as that.
Why? That's just silly. So if I wanted a fast car I'd just buy a Bugatti, and there's nothing in between? Jesus Christ man, what you are saying makes zero sense.

Also, the 7950X doesn't beat everything Intel has to offer in the first place. Easy example: Cinema 4D - I can't even begin to fathom how much power a 7950X would need just to reach the 14900K's score. According to this very site (reputable, as you called it), the 14900K and 13900K are the two fastest desktop CPUs in existence. You are making false claims and then you call me a fanboy... okay buddy.


Funny how none of the reputable tech sites, like the very one we are on, have anywhere near the same findings. Zen 4 idles at about 25 W, and while that's more than Intel, it's a practically meaningless difference - the extra power cost would amount to a few dollars per year at most.
40 W for doing nothing might be meaningless to you; it's meaningful to me. Especially when - again - not only will the price-equivalent AMD CPU draw 40 more watts doing NOTHING, it would also be much slower in both MT and ST tasks. What would ever be the point of buying the AMD part? Take an i5-13600K vs an R7 7700X: the 13600K is both faster and more efficient in MT workloads and consumes way less in semi-idle workloads, so why would I ever buy the 7700X?
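Whether an idle-power gap matters is just arithmetic on usage hours and electricity price. A minimal sketch - the wattage deltas, hours, and $/kWh rate below are assumed figures for illustration, not measurements from any review:

```python
# Annual electricity cost of a constant idle-power difference.
# The 25 W / 40 W deltas, usage hours, and $0.30/kWh rate are
# illustrative assumptions, not measured or quoted values.

def annual_cost(extra_watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Extra cost per year (same currency as price_per_kwh)."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

print(f"${annual_cost(25, 8, 0.30):.2f}/year")   # 25 W delta, 8 h/day  -> $21.90/year
print(f"${annual_cost(40, 24, 0.30):.2f}/year")  # 40 W delta, always on -> $105.12/year
```

Under these assumptions, both sides of the argument fall out of the same formula: a desk-hours machine sees roughly $20 a year, while an always-on box sees over $100, which is presumably why the two posters reach opposite conclusions.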

I mean, even AMD knows this - that's why their R7 is price-matched against an i5 and not an i7, because then it would be even more one-sided.

Also, this site doesn't really show different results. They tested the 14900K at various power limits and, lo and behold, at 200 W it's more efficient than the 7950X, and at 125 W it's more efficient than the 7950X3D. Their findings are pretty similar to the French review's.
 
Imo Intel will be back. They now have two of the best CPU manufacturing processes, BPD being one of them, and they are not beholden to TSMC for manufacturing like AMD is. 24/25 is going to be an interesting time for CPUs.
 

The 7800X3D doesn't have a massive lead over the 14900K in gaming performance, no. But when you factor in the price of both CPUs (the 14900K is a lot more expensive), the RAM needed to make the 14900K competitive (a lot more expensive than the 6000 MT/s kits the 7800X3D needs), as well as the ABSURD power draw of the 14900K under full load, it becomes crystal clear which is the obvious choice for a gaming CPU. And basically everyone except the most hardcore Intel fans agrees on this.

If MT performance is what's important to you, then you will obviously go for the product that does that job best. No one with a serious need for MT performance (which in 99% of cases means companies) is going to look at a 7600X and a 14600K and say "oh wow, the 14600K has so much better MT performance, let's get that!" - no, they are obviously going to look at a product tailored to MT loads.

As for the 14900K vs the 7950X, it varies by application, and as a private customer the 14900K might make more sense for you in that case. But if you're a company that needs MT performance, you aren't looking at those - you're looking at Threadripper, and Intel doesn't really have an answer to that, only Xeon, which isn't even in the same ballpark price-wise.

You keep saying 40 W idle over and over, but that doesn't make it true.

[attached chart]
 
I wrote this a long time ago, in Comet Lake times. :D

[attached screenshot]


Ultra 100 is laptop-only, just like Core 1 was, so hopefully something exciting comes to desktop after that, just like Core 2 did. The question is when... :D

I don't expect the U200 series to be worth waiting for just like that, but maybe the U300 will be.
It wasn't the first Core i that became legendary after all, it was the second!

If I was a die-hard Intel fan I'd definitely wait for U200 tho, so refreshing with something more than a refresh!
 
Imo Intel will be back. They now have two of the best CPU manufacturing processes, BPD being one of them, and they are not beholden to TSMC for manufacturing like AMD is. 24/25 is going to be an interesting time for CPUs.

Interesting, but restrained performance-wise.

Following on from the discussion here, I'm just trying to decide on a mobo and the extent to which DDR4/DDR5 RAM should be the deciding factor. No point in waiting for anything CPU-related.
 

Really? It seems to me that in the real world, in some cases even the anemic 8700G beats the i9-14900K - even using its glorious 300 W of TDP, it ends up under Zen 4's feet.
 
The 7800X3D doesn't have a massive lead over the 14900K in gaming performance, no. But when you factor in the price of both CPUs (the 14900K is a lot more expensive) and the RAM needed to make the 14900K competitive (a lot more expensive than the 6000 MT/s kits the 7800X3D needs)
Forget about the 14900K. Does it have a massive lead over the 14700K? The 13700K? Heck, even the 13600K? What's the lead - single digits at 1080p with a 4090?
as well as the ABSURD power draw of the 14900K under full load, it becomes crystal clear which is the obvious choice for a gaming CPU. And basically everyone except the most hardcore Intel fans agrees on this.
What does power draw under full load have to do with gaming? Sure, the 14900K can draw 300 W, but then it's also more than twice as fast as the 7800X3D. Heck, the 14900K restricted to 100 W still leapfrogs the 7800X3D in multithreaded workloads. So why even bring that up?
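The efficiency claims being traded here (more efficient at 125 W, leapfrogging at 100 W) all reduce to score-per-watt at a given power limit. A quick sketch - none of these scores or wattages come from an actual review, they are placeholders to show the calculation:

```python
# Multithreaded efficiency = benchmark score per sustained watt.
# All scores and power figures below are made-up placeholders.

def points_per_watt(score: float, watts: float) -> float:
    return score / watts

cpus = {
    "cpu_a @ stock (300 W)": (40000, 300),
    "cpu_a @ 125 W limit":   (30000, 125),
    "cpu_b @ stock (230 W)": (38000, 230),
}

for name, (score, watts) in cpus.items():
    print(f"{name}: {points_per_watt(score, watts):.0f} pts/W")
```

Power-limiting a big chip usually costs some absolute score but raises points-per-watt, which is why the same CPU can look inefficient stock and efficient capped.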

If MT performance is what's important to you, then you will obviously go for the product that does that job best. No one with a serious need for MT performance (which in 99% of cases means companies) is going to look at a 7600X and a 14600K and say "oh wow, the 14600K has so much better MT performance, let's get that!" - no, they are obviously going to look at a product tailored to MT loads.
That's just absurd. Just because someone values MT performance doesn't mean they'll go for a $5000 Threadripper. That's why companies make products for every budget. If that were the case, AMD would only make the 7600X for people who don't care about MT and the 7950X, with nothing in between.
You keep saying 40 W idle over and over, but that doesn't make it true.
I'm not the one saying it - the reviewer measured it at the wall. Power reporting on AMD CPUs goes through the motherboard (the SVID interface) and it's highly inaccurate. Still, even the 25-28 W you are showing is absurd, don't you think? I'm playing Dirt 2 at 4K 120 FPS Ultra on a 12900K pulling fewer watts than your CPU does at idle. I don't know man, I can't wrap my head around that.


Really? It seems to me that in the real world, in some cases even the anemic 8700G beats the i9-14900K - even using its glorious 300 W of TDP, it ends up under Zen 4's feet.
You realize there were graphics benchmarks in there, right? The 8000G series has a decent iGPU, so yeah, obviously?
 
You realize there were graphics benchmarks in there, right? The 8000G series has a decent iGPU, so yeah, obviously?
No, not really. Tip: "Device: CPU". Read again carefully, and don't be so eager to defend Intel's super refresh, 300 W edition.
 
Intel even gave the last three "generations" the same product code, just like with the 8700K and 9900K. :confused: I only expected that for 13th and 14th gen.

[attached screenshot]
 
No, not really. Tip: "Device: CPU". Read again carefully, and don't be so eager to defend Intel's super refresh, 300 W edition.
The first whole page of benchmarks was GPU benchmarks, and the G was topping the entire chart, the 7950X included. What are you even suggesting? That the 8700G is faster than the 14900K? Okay bro.
 
The first whole page of benchmarks was GPU benchmarks, and the G was topping the entire chart, the 7950X included. What are you even suggesting? That the 8700G is faster than the 14900K? Okay bro.
You conveniently ignored the benchmarks page I linked to and jumped to the first page, pretending not to see the 8700G - even limited to 45 W - beating the i9 at 240-300 W.

Yes, the 8700g beats the i9 in several situations. Ouch.
 
Forget about the 14900K. Does it have a massive lead over the 14700K? The 13700K? Heck, even the 13600K? What's the lead - single digits at 1080p with a 4090?

What does power draw under full load have to do with gaming? Sure, the 14900K can draw 300 W, but then it's also more than twice as fast as the 7800X3D. Heck, the 14900K restricted to 100 W still leapfrogs the 7800X3D in multithreaded workloads. So why even bring that up?


That's just absurd. Just because someone values MT performance doesn't mean they'll go for a $5000 Threadripper. That's why companies make products for every budget. If that were the case, AMD would only make the 7600X for people who don't care about MT and the 7950X, with nothing in between.

I'm not the one saying it - the reviewer measured it at the wall. Power reporting on AMD CPUs goes through the motherboard (the SVID interface) and it's highly inaccurate. Still, even the 25-28 W you are showing is absurd, don't you think? I'm playing Dirt 2 at 4K 120 FPS Ultra on a 12900K pulling fewer watts than your CPU does at idle. I don't know man, I can't wrap my head around that.


You realize there were graphics benchmarks in there, right? The 8000G series has a decent iGPU, so yeah, obviously?

It is roughly 10% faster on average than the 14700K, while also being cheaper (even with the 14700K being discounted) and using less than half the power on average in games... and you would still need more expensive RAM for the 14700K to be competitive.
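The value case above (performance against CPU price plus the RAM each platform needs) can be made explicit. Every FPS and price figure below is a hypothetical placeholder to show the shape of the calculation, not a real benchmark or retail number:

```python
# Gaming value: average FPS per dollar of CPU + required RAM.
# All numbers here are made-up placeholders for illustration only.

def fps_per_dollar(avg_fps: float, cpu_price: float, ram_price: float) -> float:
    return avg_fps / (cpu_price + ram_price)

value_a = fps_per_dollar(avg_fps=110, cpu_price=370, ram_price=100)  # faster, cheaper RAM
value_b = fps_per_dollar(avg_fps=100, cpu_price=400, ram_price=150)  # needs pricier kit
print(f"A: {value_a:.3f} fps/$  B: {value_b:.3f} fps/$")  # A: 0.234 fps/$  B: 0.182 fps/$
```

With these assumed numbers, A wins both the numerator and the denominator, which is exactly the structure of the argument being made here.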



The 14900K uses A LOT more power than the 7800X3D in gaming as well - quit the Intel fanboy BS. You can't wrap your head around a CPU using 25 W? How about your precious Intel CPU averaging 144 W in games...

[attached chart: power-games.png]


Someone who values MT performance... who are we talking about? Someone from the tiny PC enthusiast segment? Because the vast majority of PC users don't give a flying fack about it - they either use the PC for office work from home or for gaming. PC enthusiasts who care about MT performance account for a tiny fraction of PC users... so we are talking about a tiny group of people who want MT performance but don't want to pay for it. Sounds like the kind of scenario only an Intel fanboy could come up with to highlight Intel's only current strength. The average consumer buys hardware according to recommendations and overall performance, and doesn't need or use the MT performance. Companies, who are the ones that actually need MT performance, don't buy midrange CPUs for the task.
 
You conveniently ignored the benchmarks page I linked to and jumped to the first page, pretending not to see the 8700G - even limited to 45 W - beating the i9 at 240-300 W.

Yes, the 8700G beats the i9 in several situations. Ouch.
The first benchmark you linked has the 14900k beating everything (by a margin) while consuming less power than the 7950x. Expected, but still.

In the benchmarks where the 8700G is beating the 14900K, it's also beating the 7950X. But I guess that's not ouch. Sure.

For example

[attached benchmark chart]

The 14600K beats the 7950X, and the 8600G beats all of them. Does that make sense to you?
 
Interesting, but restrained performance-wise.

Following on from the discussion here, I'm just trying to decide on a mobo and the extent to which DDR4/DDR5 RAM should be the deciding factor. No point in waiting for anything CPU-related.

I have been sitting happily on my 12700K since Nov '21, and I'm happy to wait. AMD just isn't reliable enough for me - it might be quicker in gaming, but factor in my zero BSODs or crashes since I built it, which is more important to me. I am in no rush to upgrade. How long until my setup is really outdated and useless? Still a few years in it yet. I will see what they have later this year or next, then decide whether I stick with Intel or switch to AMD, and that depends on reliability.
 
It is roughly 10% faster on average than the 14700K, while also being cheaper (even with the 14700K being discounted) and using less than half the power on average in games... and you would still need more expensive RAM for the 14700K to be competitive.


Even with a 4090 at 1080p, the 7800X3D is barely 10% faster than the 14600K - review from this very site. At normal resolutions - even with a 4090 - they are identical, while the 7800X3D is a lot more expensive and slower in MT and ST workloads. Great deal, lol

The 14900K uses A LOT more power than the 7800X3D in gaming as well - quit the Intel fanboy BS. You can't wrap your head around a CPU using 25 W? How about your precious Intel CPU averaging 144 W in games...
Out of the box, sure. It can actually hit 200 W in TLOU. It's obvious that if you care about efficiency in gaming, you either don't buy a 14900K or don't run it out of the box. I've posted some results in the other thread at 120 W; I'm pretty confident that even at that wattage it's faster than your X3D, but go ahead and prove me wrong.

Someone who values MT performance... who are we talking about? Someone from the tiny PC enthusiast segment? Because the vast majority of PC users don't give a flying fack about it - they either use the PC for office work from home or for gaming. PC enthusiasts who care about MT performance account for a tiny fraction of PC users... so we are talking about a tiny group of people who want MT performance but don't want to pay for it. Sounds like the kind of scenario only an Intel fanboy could come up with to highlight Intel's only current strength. The average consumer buys hardware according to recommendations and overall performance, and doesn't need or use the MT performance. Companies, who are the ones that actually need MT performance, don't buy midrange CPUs for the task.
The vast majority of PC users don't give a flying duck about a $400+ gaming CPU and which one is faster at 720p with a 4090. They don't have, and never will have, a 4090. So obviously this whole debate is about that small PC enthusiast segment. For everyone else it's irrelevant in any case.

I asked you something very simple before: 13600K (an i5, btw) vs 7700X (an R7, btw). Similar price, and the 13600K basically beats the 7700X in every metric, from idle/light-load efficiency to multithreaded performance and efficiency, while they perform similarly in games. You said there is no segment where Intel is the better choice, and yet I can't see any reason why I'd buy the 7700X over the 13600K.
 
There is also the budget range, like my 12100F. When I bought this in February 2022, AMD had literally nothing to offer in this price range with comparable single-thread/IPC performance, which I needed for the games I was playing or planning to play at the time. (My 1600X was struggling with those, and my mobo wasn't getting the BIOS update for Zen 3, so I sold the entire platform and switched.)
Even today I would be hard-pressed to find anything in that range from AMD, since they have kind of left that market for whatever reason. (I used to be mainly an AMD budget user - I had only owned two Intel CPUs before this one.)

Even if I do upgrade, it won't be anything higher than a 13400/13500, which will last me years.
 
My last socket 1700 upgrade will probably be an i7-14700K, but it's not needed yet.
A new GPU for GTA 6 in 2026 will probably be the first major upgrade...
 
The last time I "waited" for an Intel chip was when I was still on Socket 7, waiting for all the moving parts in the works to settle long enough for my first build on Socket 478.

Pentium MMX 233MHz -> Pentium 4 3.00E
That was 20 years ago.

5 years later, a Phenom II X4 happened because I wanted 64-bit.
9 years later, an FX-8370 happened because I wanted the VR experience.
A year later I built this Ryzen 3600 box. Notice a pattern in there? Yeah, I don't either. :rolleyes:

What major milestone is going to pull me into my first Intel build in over two decades?
Hyper-Threading wasn't very good back then, and it's finally going away now that we have the performance-core scheduling disparity.
Thunderbolt? Didn't need it 10 years ago, don't need it now.
USB4? Snore.
PCI-E 5.0? Maybe in another 5 years but not now.
Outrageously high temporary boosting single core clocks? Nah.
Intel Gigabit? Hahahaha
Intel Wi-Fi? HAHAHAAHAHA
Intel 10GbE SFP? Seduce me. :wtf:
 
Even with a 4090 at 1080p, the 7800X3D is barely 10% faster than the 14600K - review from this very site. At normal resolutions - even with a 4090 - they are identical, while the 7800X3D is a lot more expensive and slower in MT and ST workloads. Great deal, lol
You do realize that, with GPUs getting faster and higher resolutions becoming more CPU-bound, that 10% at 1080p will start translating into meaningful differences at 1440p and 4K, right? Especially in minimum FPS, which is what actually matters - and if we look at individual game benches from the 14900K review, some games give the 7800X3D a bigger lead than 10% over its Intel rival. People usually don't replace their CPU every generation, and one build can see multiple GPUs before that happens.
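The GPU-scaling point is easiest to see with the usual bottleneck model, fps = min(cpu_limit, gpu_limit): a CPU lead that is invisible behind today's GPU limit reappears once a faster card raises that limit. All the FPS numbers below are illustrative only:

```python
# Frame rate is capped by whichever component is the bottleneck.
# All FPS limits below are illustrative, not measured values.

def fps(cpu_limit: float, gpu_limit: float) -> float:
    return min(cpu_limit, gpu_limit)

cpu_fast, cpu_slow = 200, 180       # ~10% CPU-bound gap (as seen at 1080p)
gpu_today, gpu_future = 150, 220    # 4K GPU limit: current card vs next gen

print(fps(cpu_fast, gpu_today), fps(cpu_slow, gpu_today))    # 150 150 -> gap hidden
print(fps(cpu_fast, gpu_future), fps(cpu_slow, gpu_future))  # 200 180 -> gap visible
```

The counterargument in the thread amounts to disputing how fast the gpu_limit side actually rises for midrange cards; the model itself is common ground.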
 
Even with a 4090 at 1080p, the 7800X3D is barely 10% faster than the 14600K - review from this very site. At normal resolutions - even with a 4090 - they are identical, while the 7800X3D is a lot more expensive and slower in MT and ST workloads. Great deal, lol


Out of the box, sure. It can actually hit 200 W in TLOU. It's obvious that if you care about efficiency in gaming, you either don't buy a 14900K or don't run it out of the box. I've posted some results in the other thread at 120 W; I'm pretty confident that even at that wattage it's faster than your X3D, but go ahead and prove me wrong.


The vast majority of PC users don't give a flying duck about a $400+ gaming CPU and which one is faster at 720p with a 4090. They don't have, and never will have, a 4090. So obviously this whole debate is about that small PC enthusiast segment. For everyone else it's irrelevant in any case.

I asked you something very simple before: 13600K (an i5, btw) vs 7700X (an R7, btw). Similar price, and the 13600K basically beats the 7700X in every metric, from idle/light-load efficiency to multithreaded performance and efficiency, while they perform similarly in games. You said there is no segment where Intel is the better choice, and yet I can't see any reason why I'd buy the 7700X over the 13600K.

Are you kidding with that comparison? Someone looking to buy a 7800X3D is looking to buy the best gaming CPU. You always get better value with lower-tier products. But even then, the 14600K is hardly cheaper than the 7800X3D... only 10% without the current discount.


Faster than what, in what? And lol, 120 W is still twice as much as the 7800X3D uses...

From your previous description, it sounded like MT performance was close to being the sole decider in CPU purchases for essentially all consumers. The reality is that it doesn't matter at all to the vast majority of them, so it's a useless advantage for Intel in the midrange CPUs consumers actually buy - courtesy of Intel's mobile chip design that they forced onto desktop as well with the e-waste cores. The exact reason I decided to move away from Intel after 20 years of using Intel.

The 7700X makes very little sense - I'd instead get the 7600X, which is 5% slower than the 14600K in gaming but almost 50% cheaper, and again doesn't need RAM that expensive.

 
Are you kidding with that comparison? Someone looking to buy a 7800X3D is looking to buy the best gaming CPU. You always get better value with lower-tier products. But even then, the 14600K is hardly cheaper than the 7800X3D... only 10% without the current discount.
If you always get better value with lower-tier products (and I agree), then why the hell are you even comparing the X3D to the 14900K? Especially when the difference in pure horsepower between the two is, I don't know, vast?

Now let me use your argument: if someone is looking to buy the best gaming CPU, they don't buy the X3D. They buy a 14900KS with an Apex Encore and 8400 MT/s RAM. Yeah, I know, it sounds stupid - but that was your argument when it comes to MT: either you go all in and buy a 7950X or a TR, or you buy a 7600X.

From your previous description, it sounded like MT performance was close to being the sole decider in CPU purchases for essentially all consumers. The reality is that it doesn't matter at all to the vast majority of them, so it's a useless advantage for Intel in the midrange CPUs consumers actually buy - courtesy of Intel's mobile chip design that they forced onto desktop as well with the e-waste cores. The exact reason I decided to move away from Intel after 20 years of using Intel.

The 7700X makes very little sense - I'd instead get the 7600X, which is 5% slower than the 14600K in gaming but almost 50% cheaper, and again doesn't need RAM that expensive.

No, I'm not saying MT performance is the sole decider, but when everything else is pretty much identical (including the price), then...

Anyway, your original argument was that there is no segment in which Intel is the better choice, and that's just obviously wrong - quite the opposite is true: Intel is the better choice in most segments.

You do realize that, with GPUs getting faster and higher resolutions becoming more CPU-bound, that 10% at 1080p will start translating into meaningful differences at 1440p and 4K, right? Especially in minimum FPS, which is what actually matters - and if we look at individual game benches from the 14900K review, some games give the 7800X3D a bigger lead than 10% over its Intel rival. People usually don't replace their CPU every generation, and one build can see multiple GPUs before that happens.
For the average user? Nope, not going to happen. GPUs get faster (do they really? 6800 XT -> 7800 XT), but new games also need more and more GPU power. I don't see an xx70-tier GPU being bottlenecked by a 13600K in the next 4-5 years.
 

Seriously, if I didn't know you were a hardcore Intel fanboy, I'd have accused you of trolling...

There is no segment in which Intel is the better choice.
 
Seriously, if I didn't know you were a hardcore Intel fanboy, I'd have accused you of trolling...

There is no segment in which Intel is the better choice.
Intel literally has the fastest application CPUs in every price segment (every single one, lol) according to this very reputable website. That alone makes you automatically wrong. What the hell, man... I'm the fanboy, but you're saying the fastest CPUs per price in every single segment aren't the better choice - the slower CPUs are. Why? Because they have the AMD logo on the box? Give me a break, man, please.

Go ahead, buy the slower CPUs that consume 40 W more at the wall even at idle, doing nothing. Makes sense, great deal - just keep it to yourself.
 
Intel literally has the fastest application CPUs in every price segment (every single one, lol) according to this very reputable website. That alone makes you automatically wrong. What the hell, man... I'm the fanboy, but you're saying the fastest CPUs per price in every single segment aren't the better choice - the slower CPUs are. Why? Because they have the AMD logo on the box? Give me a break, man, please.

Go ahead, buy the slower CPUs that consume 40 W more at the wall at idle. Makes sense, great deal - just keep it to yourself.

And again, it doesn't matter to the vast majority of users. They either need a cheap home office PC or a gaming PC - for the former AMD makes more sense with its better iGPUs, and for the latter AMD makes more sense with vastly better value/performance in gaming.
 
And again, it doesn't matter to the vast majority of users. They either need a cheap home office PC or a gaming PC - for the former AMD makes more sense with its better iGPUs, and for the latter AMD makes more sense with vastly better value/performance in gaming.
So the CPUs that are faster in 99% of tasks don't matter, but what matters is who's faster with a $2000 GPU at 720p. And AMD isn't faster even at that ANYWAY. Both the 13600K and the 14600K are faster than the 7600, the 7600X, the 7700X, the 7900X, and the 7950X. Numbers from this very website...


So the 14600K is faster (sometimes substantially) than everything else in its price range in games, MT apps, and ST apps, and doesn't consume 40 W sitting there idle, but it's not the best choice because of the wrong logo on the box. Kk
 