Wednesday, March 13th 2024

Intel Core i7-14700K Slides Down to $389

The sub-$400 desktop processor segment is heating up with the recent arrival of the AMD Ryzen 9 7900X3D, a 12-core/24-thread processor priced at $389. The 7900X3D boasts 3D V-Cache technology, and is tested to offer gaming performance comparable to the Core i9-13900K. Which other chip offers i9-13900K-class gaming performance at a much lower price? The Core i7-14700K. Pricing of this chip is on a downward trend, and Newegg is selling it for $389: the chip is listed at $399, with a coupon shaving off $10. At $389, the i7-14700K should offer gaming performance comparable to the i9-13900K, and by extension, the 7900X3D.

The Core i7-14700K "Raptor Lake Refresh" processor features an interesting 8P+12E core configuration, with eight "Raptor Cove" performance cores and twelve "Gracemont" efficiency cores. Each P-core has 2 MB of dedicated L2 cache, each of the three E-core clusters shares a 4 MB L2 cache among its four cores, and the eight P-cores and three E-core clusters share a 33 MB L3 cache. The i7-14700K is compatible with all Intel 600-series and 700-series chipset motherboards, though some of them require a UEFI firmware update. An interesting point to note is that while the i7-14700K is selling at $389, its sibling without the iGPU, the i7-14700KF, remains at $399.
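
Those per-core figures add up to the chip's totals of 28 MB of L2 and 33 MB of L3; a quick illustrative tally (plain Python, nothing vendor-specific):

# Cache tally for the i7-14700K, using the figures quoted above
p_cores = 8               # "Raptor Cove" P-cores
l2_per_p_core_mb = 2      # dedicated L2 per P-core
e_clusters = 3            # "Gracemont" E-core clusters, four cores each
l2_per_cluster_mb = 4     # shared L2 per E-core cluster
l3_mb = 33                # shared L3 (Intel Smart Cache)

total_l2_mb = p_cores * l2_per_p_core_mb + e_clusters * l2_per_cluster_mb
print(f"Total L2: {total_l2_mb} MB")               # 28 MB
print(f"Total L2 + L3: {total_l2_mb + l3_mb} MB")  # 61 MB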
Source: Tom's Hardware

35 Comments on Intel Core i7-14700K Slides Down to $389

#1
ratirt
Good that the price's going down. What Intel should consider is bumping up the amount of cache the CPUs have instead of E-cores. I'm sure it would increase gaming performance and help in some other apps.
#2
Space Lynx
Astronaut
ratirt: Good that the price's going down. What Intel should consider is bumping up the amount of cache the CPUs have instead of E-cores. I'm sure it would increase gaming performance and help in some other apps.
They are, with Arrow Lake, which comes out in October or November this year. Intel 20A node, I think.
#3
ExcuseMeWtf
Unlike the 14900KS, this might actually be worth a look.
#4
Broken Processor
For gaming, unless you live and breathe team blue, you're still going to want a 7800X3D.
#5
JustBenching
Broken Processor: For gaming, unless you live and breathe team blue, you're still going to want a 7800X3D.
13600k and put the rest into a better gpu
#6
ratirt
fevgatos: 13600k and put the rest into a better gpu
I'd go with the 7600X instead since it is a tiny bit faster and seems cheaper where I live. Newer platform with plenty of upgrade possibilities in the long run.
#7
JustBenching
ratirt: I'd go with the 7600X instead since it is a tiny bit faster and seems cheaper where I live. Newer platform with plenty of upgrade possibilities in the long run.
Still, for gaming you don't want the 7800X3D, which was the point.
#8
ratirt
fevgatos: Still, for gaming you don't want the 7800X3D, which was the point.
It costs more, but at the same time it is much faster. If you have a 4090, for instance, you definitely want the 7800X3D.
Gaming is the only reason you would want to have the 7800X3D.
#9
JustBenching
ratirt: It costs more, but at the same time it is much faster. If you have a 4090, for instance, you definitely want the 7800X3D.
Gaming is the only reason you would want to have the 7800X3D.


Is it?
#10
ratirt
fevgatos: Is it?
You should have mentioned 4K, but you didn't, did you? You also didn't mention RT, did you?
X3D CPUs are the fastest for gaming, and that's a fact whether you like it or not.
#11
JustBenching
ratirt: You should have mentioned 4K, but you didn't, did you? You also didn't mention RT, did you?
X3D CPUs are the fastest for gaming, and that's a fact whether you like it or not.
What resolution are you going to play at with a 4090? The averages only have 2 games with RT btw.
#12
EatingDirt
fevgatos: What resolution are you going to play at with a 4090? The averages only have 2 games with RT btw.
Some people play at 1440p for high FPS. The 7800X3D is 10% faster in this case.

Some people play in Ultrawide resolutions that fall between 1440p & 4k, which might show a CPU bottleneck.

Some people are going to upgrade to something more powerful than a 4090 in the future, in which case, the 5090 may see some/more games that are limited by the CPU at 4k.

Bonus for the 7800X3D is that it uses ~30% less power during gaming than the 13600/14600k, per TPU aggregate.
#13
Vayra86
Lovely how some people think they have the truth against all odds because a canned run in a defined benchmark suite for gaming isn't showing much difference in CPU performance. I also love how they think you only benefit from a faster CPU if you run the fastest GPU. It speaks volumes of how broad the perspective of said people really is: Review Reality. I do wonder if they ever play real games beyond a 15-minute run to look at the graphics.

In the real world, though, you see several use cases where the fastest possible CPU matters, sometimes even more than having the fastest GPU.
It also applies to ANY game you want to play at high FPS and have no issues dropping image quality to get there.

But nah, don't get a 7800X3D, it's only the fastest thing you can have for gaming across the board, most notably in the situations where it matters because every CPU is brought to its knees in said game loads.

Better off buying a subpar Intel CPU that guzzles power to reach performance parity. :roll: :roll: How lost can you get? There is barely any gaming use case where Intel has a better offer right now.
#14
JustBenching
Vayra86: Lovely how some people think they have the truth against all odds because a canned run in a defined benchmark suite for gaming isn't showing much difference in CPU performance. I also love how they think you only benefit from a faster CPU if you run the fastest GPU. It speaks volumes of how broad the perspective of said people really is: Review Reality. I do wonder if they ever play real games beyond a 15-minute run to look at the graphics.

In the real world, though, you see several use cases where the fastest possible CPU matters, sometimes even more than having the fastest GPU.
It also applies to ANY game you want to play at high FPS and have no issues dropping image quality to get there.

But nah, don't get a 7800X3D, it's only the fastest thing you can have for gaming across the board, most notably in the situations where it matters because every CPU is brought to its knees in said game loads.

Better off buying a subpar Intel CPU that guzzles power to reach performance parity. :roll: :roll: How lost can you get? There is barely any gaming use case where Intel has a better offer right now.
I agree with what you are saying, but one paragraph goes completely against the next. You are basically claiming, and I agree with you, that reviews are just canned benchmarks and don't show real-world performance. But then how do you know the X3D is the fastest gaming CPU in the real world?

Two days ago we tested an X3D against a stock 12900K in Cyberpunk. Performance was exactly the same. According to TPU, though, the 7800X3D is 30% faster in that game. So your real-world scenario goes completely against the point you are making.
#15
Vayra86
fevgatos: I agree with what you are saying, but one paragraph goes completely against the next. You are basically claiming, and I agree with you, that reviews are just canned benchmarks and don't show real-world performance. But then how do you know the X3D is the fastest gaming CPU in the real world?

Two days ago we tested an X3D against a stock 12900K in Cyberpunk. Performance was exactly the same. According to TPU, though, the 7800X3D is 30% faster in that game. So your real-world scenario goes completely against the point you are making.
Maybe we oughta stop looking at Shittypunk for eternity, because frankly, it doesn't matter what that one game does - every CPU gets decent perf on it and you're more likely GPU limited. It's not a benchmark for anything else in the world either: proprietary engine, loaded to the brim with custom and tweaked-for-a-GPU-vendor code, etc etc etc. Not to mention the fact the game is still full of bugs and weird behaviour, a lot of it attributable to that same weird game logic they got going on. Let's not even begin to mention the scheduling issues surrounding E-cores and whatnot. Basically the game needed TLC for every single hardware change.

When you look at the benches that focus on real gaming situations that are specifically CPU-heavy, such as 4X/strategy end-game, games that move lots of assets and game logic all the time, you see the X3Ds shine. It's precisely the kind of situation you won't find reviewers testing, because they'd have to nick a savegame somewhere or actually play a game for 3-4 hours to get to that point.
#16
JustBenching
Vayra86: Maybe we oughta stop looking at Shittypunk for eternity, because frankly, it doesn't matter what that one game does - every CPU gets decent perf on it and you're more likely GPU limited. It's not a benchmark for anything else in the world either: proprietary engine, loaded to the brim with custom and tweaked-for-a-GPU-vendor code, etc etc etc. Not to mention the fact the game is still full of bugs and weird behaviour, a lot of it attributable to that same weird game logic they got going on. Let's not even begin to mention the scheduling issues surrounding E-cores and whatnot. Basically the game needed TLC for every single hardware change.

When you look at the benches that focus on real gaming situations that are specifically CPU-heavy, such as 4X/strategy end-game, games that move lots of assets and game logic all the time, you see the X3Ds shine. It's precisely the kind of situation you won't find reviewers testing, because they'd have to nick a savegame somewhere or actually play a game for 3-4 hours to get to that point.
But according to TPU the X3D should be 30% faster. In reality it isn't, and no, we weren't GPU bound. Whether the game should be used as a benchmark is irrelevant; the point you (and I) were making was the difference between reviews and reality.

It's not true either that the X3D shines in those games. Factorio, which has become the foremost game to show the X3D's prowess, shows the exact opposite of what you are claiming. In reviews the X3D is twice as fast as its competition. In real-world situations, after 4 hours of gameplay on big maps, the 14900K is actually faster even in Factorio. The game has its own database of performance based on the size of the map, and the "goal" is to maintain 60 fps with the biggest map available. Intel takes the top spots there. The X3D only does well on the small 10k maps with only a handful of units.

So the question is, how do you know that the X3D does better in real-world benchmarks? Have you actually tested the two CPUs? I've posted some numbers from real gameplay in Hogwarts, TLOU, Cyberpunk, KCD, Starfield; the X3D can barely match a 12900K. Are these games flawed too?
#17
Vayra86
fevgatos: But according to TPU the X3D should be 30% faster. In reality it isn't, and no, we weren't GPU bound. Whether the game should be used as a benchmark is irrelevant; the point you (and I) were making was the difference between reviews and reality.

It's not true either that the X3D shines in those games. Factorio, which has become the foremost game to show the X3D's prowess, shows the exact opposite of what you are claiming. In reviews the X3D is twice as fast as its competition. In real-world situations, after 4 hours of gameplay on big maps, the 14900K is actually faster even in Factorio. The game has its own database of performance based on the size of the map, and the "goal" is to maintain 60 fps with the biggest map available. Intel takes the top spots there. The X3D only does well on the small 10k maps with only a handful of units.

So the question is, how do you know that the X3D does better in real-world benchmarks? Have you actually tested the two CPUs? I've posted some numbers from real gameplay in Hogwarts, TLOU, Cyberpunk, KCD, Starfield; the X3D can barely match a 12900K. Are these games flawed too?
I've had hands-on time in TW WH3 and Stellaris, mostly on the 5800X3D, and briefly on the 7800X3D. Especially in Stellaris, the differences were staggering (and much needed; the game crawls after year 2300-2350 already).

It's interesting, your input. Perhaps my view is clouded too. Defo going to look more into this :)
#18
JustBenching
Vayra86: I've had hands-on time in TW WH3 and Stellaris, mostly on the 5800X3D, and briefly on the 7800X3D. Especially in Stellaris, the differences were staggering (and much needed; the game crawls after year 2300-2350 already).

It's interesting, your input. Perhaps my view is clouded too. Defo going to look more into this :)
Go check the Stellaris benchmark database; it has results per map. On the 60k maps Intel CPUs are at the top; on the small 10k maps the X3D is twice as fast as Intel.

Thing with Stellaris is, even if you play 10k maps, the game on the official servers is played at 60 fps. So the fact that the X3D can get 500 is irrelevant. But on the big 60k maps, where a lot of CPUs actually struggle to get that 60, the X3D doesn't have the lead anymore.

Anyway, we are being very specific; my question was more like: how do we know which is better in the real world unless we specifically test them for that?
#19
Vayra86
fevgatos: Go check the Stellaris benchmark database; it has results per map. On the 60k maps Intel CPUs are at the top; on the small 10k maps the X3D is twice as fast as Intel.

Thing with Stellaris is, even if you play 10k maps, the game on the official servers is played at 60 fps. So the fact that the X3D can get 500 is irrelevant. But on the big 60k maps, where a lot of CPUs actually struggle to get that 60, the X3D doesn't have the lead anymore.

Anyway, we are being very specific; my question was more like: how do we know which is better in the real world unless we specifically test them for that?
In the end, to me, personally, the bottom line matters much more than the possibly achievable top end in terms of FPS. I would much prefer reviewers create (gaming) benchmark suites that put heavy focus on those CPU-killer situations. A walk through town in TW3. A Stellaris endgame on various map sizes. The CPU hit caused by RT. Etc.

Also, much greater focus on minimums and frametime impact rather than averages, for mostly the same reasons GPU reviews have put focus on those.
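
On the minimums point: outlets differ slightly on how "1% lows" are computed, but a minimal sketch of one common approach, averaging FPS over the slowest 1% of frames, looks like the snippet below (the frametimes_ms values are made-up example data, not from any specific capture tool):

# Minimal sketch: average FPS and 1% low from a frametime log, in milliseconds.
# frametimes_ms is hypothetical example data; real captures have thousands of samples.
frametimes_ms = [6.9, 7.1, 7.0, 7.3, 25.4, 7.2, 7.0, 6.8, 7.4, 30.1]

avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)

# 1% low: average FPS over the slowest 1% of frames (at least one frame).
worst = sorted(frametimes_ms, reverse=True)
n = max(1, len(worst) // 100)
one_percent_low_fps = 1000 * n / sum(worst[:n])

print(f"Average FPS: {avg_fps:.1f}")             # ~89 FPS
print(f"1% low FPS: {one_percent_low_fps:.1f}")  # ~33 FPS, dragged down by the spikes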
#20
JustBenching
Vayra86: In the end, to me, personally, the bottom line matters much more than the possibly achievable top end in terms of FPS. I would much prefer reviewers create (gaming) benchmark suites that put heavy focus on those CPU-killer situations. A walk through town in TW3. A Stellaris endgame on various map sizes. The CPU hit caused by RT. Etc.

Also, much greater focus on minimums and frametime impact rather than averages, for mostly the same reasons GPU reviews have put focus on those.
That's why I like testing Cyberpunk. Tom's Diner, max populated, with RT shows 100% usage on 6-core chips. A 7600X I tested dropped to below 60 fps!!
#21
ratirt
fevgatos: What resolution are you going to play at with a 4090? The averages only have 2 games with RT btw.
If you buy a 4090 you will most likely try playing with RT on, and that would not be 4K but rather 1440p or even 1080p, and this will surely bottleneck the CPU, so you would need an X3D.
Also, future upgrades. If you have a 4080 or 4090, you might jump to a faster one in a year or two. Then the CPU bottleneck will be even more evident.
The 14700K seems like it is OK for gaming, but the price is still high and an X3D would still be the better purchase. Also, the upgrade path from a 14700K isn't in any way comprehensive. Same goes for the 13600K.
#22
JustBenching
ratirt: If you buy a 4090 you will most likely try playing with RT on, and that would not be 4K but rather 1440p or even 1080p, and this will surely bottleneck the CPU, so you would need an X3D.
Also, future upgrades. If you have a 4080 or 4090, you might jump to a faster one in a year or two. Then the CPU bottleneck will be even more evident.
The 14700K seems like it is OK for gaming, but the price is still high and an X3D would still be the better purchase. Also, the upgrade path from a 14700K isn't in any way comprehensive. Same goes for the 13600K.
The X3D is slower in Cyberpunk.
#23
ratirt
fevgatos: The X3D is slower in Cyberpunk.
Slower than which CPU? The 14900K? According to you it is, but most reviewers and benchmarks, unfortunately, can't support your theory.
But sure, CP2077 is the game that describes performance reality. Not likely.
#24
JustBenching
ratirt: Slower than which CPU? The 14900K? According to you it is, but most reviewers and benchmarks, unfortunately, can't support your theory.
But sure, CP2077 is the game that describes performance reality. Not likely.
Than everything Intel; we tested it with dragam. Even a 12900K matches the X3D :D
#25
EatingDirt
fevgatos: Than everything Intel; we tested it with dragam. Even a 12900K matches the X3D :D
Don't know what you're doing wrong, considering TPU's numbers show the 7800X3D light years ahead of even the new 14900KS in Cyberpunk at 720p, 1080p & 1440p. Unless, of course, you're just using 4K numbers, in which case a 'win' is within the margin of error, and not really a win at all.