
Intel Core i9-13900K

I feel so bad about myself, but I saw 32 GB of DDR5-6400 at €325, so I've cancelled my budget TUF Z790-Plus D4 and switched to a ROG Z790-F... I hope it's worth it
 
It's still strange that this K CPU is marketed as a 125 W CPU, which it really isn't.

Also, the perf/watt isn't better, it's worse; objectively worse in MT, and in ST-focused gaming the 5800X3D is still better, as is even the 7600X.

Now play an actual game, and divide the power draw by the FPS, then show me what you get.


Or better yet, let the best technical youtuber do it for you.
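To make that metric concrete, here's a minimal sketch (the numbers are invented placeholders, not measurements): dividing average power by average FPS gives energy per frame, and the inverse gives frames per watt.

```python
# Invented placeholder numbers, NOT measured results: they only illustrate
# the "divide the power draw by the FPS" metric described above.
samples = {
    # name: (average FPS in some game, average CPU package power in watts)
    "cpu_a": (180.0, 140.0),
    "cpu_b": (175.0, 60.0),
}

for name, (fps, watts) in samples.items():
    joules_per_frame = watts / fps   # W = J/s, so W / (frames/s) = J per frame
    fps_per_watt = fps / watts       # higher means more efficient
    print(f"{name}: {joules_per_frame:.2f} J/frame, {fps_per_watt:.2f} FPS/W")
```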
 
[Chart: efficiency-gaming.png, TPU's gaming energy efficiency results]


One of the most inefficient gaming CPUs TPU has tested.
 
Now play an actual game, and divide the power draw by the FPS, then show me what you get.


Or better yet, let the best technical youtuber do it for you.
See, that's where our approaches to looking at this diverge.

Peak FPS with the best possible GPU in a CPU-limited situation == theoretical
vs
Useful FPS in the games you actually play, on your GPU, in practice == practical

The peak FPS is obtained with a GPU that grossly exceeds what you would have in a typical system.
The energy efficiency tests are done on a much more reasonable RTX 3080. So when there isn't a fight for peak clocks/performance, Intel's xx900K falls apart in every way.
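To put that theoretical-vs-practical split in code form, here's a toy model (my own simplification, not TPU's or der8auer's methodology): the frame rate you actually see is roughly capped by whichever of the CPU or the GPU is the slower stage for that game, resolution and settings.

```python
# Toy model of CPU vs GPU limits; all numbers are invented for illustration.
def delivered_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """The system can't render faster than its slowest stage."""
    return min(cpu_fps_cap, gpu_fps_cap)

# CPU-limited review setup (flagship GPU, low resolution): CPU differences show.
print(delivered_fps(cpu_fps_cap=250, gpu_fps_cap=400))  # -> 250 (CPU-bound)

# Typical rig (mid-range GPU, higher resolution): the GPU cap hides most of them.
print(delivered_fps(cpu_fps_cap=250, gpu_fps_cap=120))  # -> 120 (GPU-bound)
```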
 
[Chart: efficiency-gaming.png, TPU's gaming energy efficiency results]


One of the most inefficient gaming CPUs TPU has tested.
Wait for the 4090 tests.

See, that's where our approaches to looking at this diverge.

Peak FPS with the best possible GPU in a CPU-limited situation == theoretical
vs
Useful FPS in the games you actually play, on your GPU, in practice == practical
Why would you buy a $600, 24-core, 13th-gen CPU and pair it with a budget GPU to game or work on?

It's almost as if testing a new CPU generation in a CPU-limited scenario is the ideal situation?

There's nothing theoretical about der8auer's tests. They're tests, end of story.

The 4090 is simply the first GPU from the new generation to be tested; lower-end ones that still outperform a 3080 are certainly coming.
 
Wait for the 4090 tests.


Why would you buy a $600, 24-core, 13th-gen CPU and pair it with a budget GPU to game or work on?

There's nothing theoretical about der8auer's tests. They're tests, end of story.
A 3080 isn't a budget GPU.
Also, let's appreciate that the test is already done at 1080p on that GPU, which it eats for breakfast; in practical terms, that means you really won't be needing even more CPU-based FPS in your gaming.
 
Wait for the 4090 tests.


Why would you buy a $600, 24-core, 13th-gen CPU and pair it with a budget GPU to game or work on?

It's almost as if testing a new CPU generation in a CPU-limited scenario is the ideal situation?

There's nothing theoretical about der8auer's tests. They're tests, end of story.

The 4090 is simply the first GPU from the new generation to be tested; lower-end ones that still outperform a 3080 are certainly coming.
Bruh, just take the L, man. You've completely lost sight of the argument, and the tests from your OWN WEBSITE say you're wrong. A 4090 won't change that.
 
A 3080 isn't a budget GPU.
It's a previous-generation GPU that isn't fast enough to put the CPU in a CPU-limited situation for testing, which is why we're going to be testing with the 4090.

Bruh, just take the L, man. You've completely lost sight of the argument, and the tests from your OWN WEBSITE say you're wrong. A 4090 won't change that.
What L? The same people who complain about the 4090's 450 W draw, which can go up to 600 W, also pretend not to notice that it's putting out twice the performance of previous-generation cards.
Who cares if the 13900K can draw 300 W? OK? So?
The chip is still faster than any other CPU in gaming when limited to its PL1 of 125 W.
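For anyone wondering what "limited to its PL1 of 125 W" means mechanically, here's a deliberately simplified sketch of the two-limit scheme (not Intel's actual firmware, which uses an exponentially weighted averaging window; 125 W / 253 W / 56 s are the commonly cited 13900K defaults): bursts may draw up to PL2, while the running average is steered back toward PL1.

```python
# Simplified sketch of PL1/PL2 behaviour; NOT Intel's real implementation.
PL1, PL2, TAU = 125.0, 253.0, 56.0   # sustained W, burst W, averaging window s

def granted_power(requested_w: float, avg_w: float) -> float:
    """Allow up to PL2 while the running average sits under PL1,
    otherwise clamp to PL1 so the average settles at roughly PL1."""
    cap = PL2 if avg_w < PL1 else PL1
    return min(requested_w, cap)

avg = 0.0
for second in range(120):                 # two minutes of an all-core load
    grant = granted_power(300.0, avg)     # the load would like 300 W
    avg += (grant - avg) / TAU            # crude moving-average update
    if second in (0, 20, 40, 119):
        print(f"t={second:3d}s granted={grant:5.1f} W avg={avg:5.1f} W")
```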
 
Wait for the 4090 tests.


Why would you buy a $600, 24-core, 13th-gen CPU and pair it with a budget GPU to game or work on?

There's nothing theoretical about der8auer's tests. They're tests, end of story.

The 4090 is simply the first GPU from the new generation to be tested; lower-end ones that still outperform a 3080 are certainly coming.
If your ultimate goal is gaming efficiency, then you'd also need to compare a 7950X with only one CCD active, (I would guess) disable Intel's E-cores, and then also undervolt/power-limit both to their most efficient modes.

But just because you buy a $600 CPU doesn't mean you will buy a $1,500 GPU. The opposite holds (why saddle a fast GPU with a slow CPU?), but there are lots of reasons to max out the CPU and not the GPU, specifically all the workstation tasks this very site tests.
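On the E-core half of that comparison: you don't strictly need a BIOS toggle for a quick test. As a rough, Linux-only illustration (the logical-CPU numbering below is an assumption, and the benchmark command is a placeholder; check your own topology with lscpu first), you can pin a benchmark to the P-core threads using ordinary process affinity:

```python
import os
import subprocess

# Assumption: logical CPUs 0-15 are the eight P-cores (two threads each) and
# 16-31 are the sixteen E-cores; this varies by board/kernel, so verify it
# with `lscpu --extended` before relying on it.
P_CORE_THREADS = set(range(0, 16))

os.sched_setaffinity(0, P_CORE_THREADS)   # 0 = the current process
subprocess.run(["./my_benchmark"])        # hypothetical benchmark binary;
                                          # it inherits the restricted affinity
```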
 
I was going to upgrade to one of these from my 5800X3D because I'm a 360 Hz 1080p gamer and want the absolute highest FPS, but fuck me, those temps and that power usage. Hard pass now.
 
So use the Intel power limits that motherboard manufacturers disable by default???? Profit.


Are you unable to change a single setting in the BIOS? Didn't realise the enthusiast community was forced to run everything at stock?
Your evasive BS answer is disingenuous.

You know full well I already do what I can to get the most performance for the least power, as most enthusiasts here do.

That's not where my problem with power-guzzling CPUs lies; technically, I agree with your point about power not being important if it gets the job done.

My issue is with the average Joe buying this because Intel cranked all the dials to claim the number-one spot, but they did it with an effing sledgehammer. Their boost algorithm is a bit shit, IMHO; these chips could and should have been way more efficient in way more use cases. It's like they don't know how to power gate, or when, again IMHO.
 
It's a previous-generation GPU that isn't fast enough to put the CPU in a CPU-limited situation for testing, which is why we're going to be testing with the 4090.
Hey I get that. This isn't a question of who's right or wrong, both ways of looking at it are valid. It depends on your use case. I think that detail is a fine thing to recognize. That is all.

The reality is, if you're not coupling this CPU with a 4090, you're doing it wrong in the gaming section. And on top of that, you aren't concerned with perf/watt at all. Even a 4090 won't realistically get you CPU-constrained gaming; I mean... it can't even hit 60 FPS in Cyberpunk :D
 
Your evasive BS answer is disingenuous.

You know full well I already do what I can to get the most performance for the least power, as most enthusiasts here do.

That's not where my problem with power-guzzling CPUs lies; technically, I agree with your point about power not being important if it gets the job done.

My issue is with the average Joe buying this because Intel cranked all the dials to claim the number-one spot, but they did it with an effing sledgehammer. Their boost algorithm is a bit shit, IMHO; these chips could and should have been way more efficient in way more use cases. It's like they don't know how to power gate, or when, again IMHO.
The average Joe shouldn't be buying a K-series CPU. The i9-13900 (non-K) and the other i5/i7/i9 chips will be an excellent choice for them, and will be what's used in most OEM systems that aren't marketed towards gamers. The average Joe isn't going to be building a PC either; the people who do, for the most part, know how to tune or will ask for advice.

I agree with your statement about pushing chips past their efficiency curves, but everyone (Intel, AMD, NVIDIA, etc.) is doing this. For one player to go the efficient route when the rest are going the performance route is bad marketing.

There's nothing BS about anything I've said.

If your ultimate goal is gaming efficiency, then you'd also need to compare a 7950X with only one CCD active, (I would guess) disable Intel's E-cores, and then also undervolt/power-limit both to their most efficient modes.

But just because you buy a $600 CPU doesn't mean you will buy a $1,500 GPU. The opposite holds (why saddle a fast GPU with a slow CPU?), but there are lots of reasons to max out the CPU and not the GPU, specifically all the workstation tasks this very site tests.
Personally, I think that if your use case can fully utilize and profit from a 24-core CPU, you probably also have the budget for a Quadro equivalent, or you see the 24 GB frame buffer of the xx90-series cards as being worth whatever NVIDIA chooses to sell them for.

I think the main issue is that everyone is evaluating these chips from their own budgets and needs. If you're not an HFR gamer or a person who actually makes money from the processing power of their computer, this chip isn't for you.
 
So basically, if you want peak performance, your man cave is a furnace and you still only gained, what, 10% over any other sane CPU.
Hey, I hear winter is coming (we'll ignore the last 2 seasons of GoT for now), so this is a nice investment in a relatively cheap heater? Though I guess your electricity bills will probably bankrupt you as well :nutkick:

No, I'm pretty sure someone said 40% more at similar wattage.
Who cares if the 13900K can draw 300 W? OK? So?
I'd like to go full Greenpeace on you, but I'll just add that any sane person would care; those who don't care for the environment don't deserve respect, IMO!
 
Now play an actual game, and divide the power draw by the FPS, then show me what you get.


Or better yet, let the best technical youtuber do it for you.

Very much what I thought would be the case. Intel chased benchmarks a bit too aggressively, and at stock the balance isn't the best mix, but with the right power limit and undervolt, plus a ton of manual optimization, it should be really good. At stock, and especially with unlimited power draw, it's crazy; Intel should really have tried to balance out the sweet spot better. I'd still say it's good in the right hands, for someone willing to optimize it. Will it be enough when the X3D parts arrive, making the same comparison? Probably not, but until then it's the best chip you can get, just perhaps not on value for the dollar. The 13600K with this same approach makes for a compelling chip.
 
It's a previous-generation GPU that isn't fast enough to put the CPU in a CPU-limited situation for testing, which is why we're going to be testing with the 4090.


What L? The same people who complain about the 4090's 450 W draw, which can go up to 600 W, also pretend not to notice that it's putting out twice the performance of previous-generation cards.
You're claiming the 13900K is more efficient. Graphs from your own site prove you wrong, and your response is "well, wait for the next GPU" instead of simply admitting that you were wrong.
Who cares if the 13900K can draw 300 W? OK? So?
Who cares? How about anyone who has to cool these monsters? There's plenty of legitimate criticism of that level of power draw in this very comment section. Being "the fastest" doesn't negate those concerns.
The chip is still faster than any other CPU in gaming when limited to its PL1 of 125 W.
According to a single YouTube source. I'd honestly expect better from TPU's proofreader.
There's nothing BS about anything I've said.
You're being disingenuous, petty, dismissive, and combative with your readers. That is, quite frankly, the level of bullshit I'd expect from WCCFtech or Reddit, not TechPowerUp.

Also, you claiming that the 13900K has better perf/watt when your own site says otherwise smells pretty strongly of bullcrap. Again, read your own site, Mr. proofreader.
 
Hey, I hear winter is coming (we'll ignore the last 2 seasons of GoT for now), so this is a nice investment in a relatively cheap heater? Though I guess your electricity bills will probably bankrupt you as well.

No, I'm pretty sure someone said 40% more at similar wattage.

I'd like to go full Greenpeace on you, but I'll just add that any sane person would care; those who don't care for the environment don't deserve respect, IMO!
So stop driving and buying foodstuffs that have been shipped across the world on fuel-oil-burning container ships, forget about fast fashion, ask your governments why they've not been building nuclear power plants, or just flip a setting in the BIOS so the PL1/PL2 limits are actually enforced. Or better yet, buy a non-K CPU that will be almost as fast in almost every situation, while giving up that last 5% of performance that costs a silly power budget. You guys need to evaluate this product in the context of its target demographic.
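And if you'd rather verify what the board is actually enforcing than trust the box, one Linux-side way (a sketch; it assumes the standard intel-rapl powercap sysfs interface, and the package domain index ":0" may differ per system) is to read the PL1/PL2 values the firmware handed to the OS:

```python
from pathlib import Path

# Package-level RAPL domain exposed by the intel_rapl powercap driver.
# The domain index (":0") is an assumption and may differ on some systems.
pkg = Path("/sys/class/powercap/intel-rapl:0")

for constraint in ("constraint_0", "constraint_1"):   # long_term / short_term
    name = (pkg / f"{constraint}_name").read_text().strip()
    limit_uw = int((pkg / f"{constraint}_power_limit_uw").read_text())
    print(f"{name} limit: {limit_uw / 1_000_000:.0f} W")
```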
 
You're claiming the 13900K is more efficient. Graphs from your own site prove you wrong, and your response is "well, wait for the next GPU" instead of simply admitting that you were wrong.

Who cares? How about anyone who has to cool these monsters? There's plenty of legitimate criticism of that level of power draw in this very comment section. Being "the fastest" doesn't negate those concerns.

According to a single YouTube source. I'd honestly expect better from TPU's proofreader.

You're being disingenuous, petty, dismissive, and combative with your readers. That is, quite frankly, the level of bullshit I'd expect from WCCFtech or Reddit, not TechPowerUp.

Also, you claiming that the 13900K has better perf/watt when your own site says otherwise smells pretty strongly of bullcrap. Again, read your own site, Mr. proofreader.
Exactly.

Nuance. That is what was missing. I mean, yes, I can read charts, and OMG it's fastuhhrr, but what's underneath?
 
You're claiming the 13900K is more efficient. Graphs from your own site prove you wrong, and your response is "well, wait for the next GPU" instead of simply admitting that you were wrong.

Who cares? How about anyone who has to cool these monsters? There's plenty of legitimate criticism of that level of power draw in this very comment section. Being "the fastest" doesn't negate those concerns.

According to a single YouTube source. I'd honestly expect better from TPU's proofreader.

You're being disingenuous, petty, dismissive, and combative with your readers. That is, quite frankly, the level of bullshit I'd expect from WCCFtech or Reddit, not TechPowerUp.

Also, you claiming that the 13900K has better perf/watt when your own site says otherwise smells pretty strongly of bullcrap. Again, read your own site, Mr. proofreader.
So you're questioning the expertise of der8auer? He's a qualified engineer who advises motherboard, GPU and CPU manufacturers on products. What are your qualifications?

People who value "the fastest", which this is, have the ability to cool these monsters.

I'm challenging your assertions with evidence.

You are doing the same.

This is neither petty nor disingenuous.
 
So stop driving and buying foodstuffs that have been shipped across the world using fuel oil burning container ships, forget about fast fashion, ask your governments why they've not been building nuclear power plants, or just flip a setting in the BIOS that enables PL1/2 limits to actually be enforced. Or better yet, buy a non-K CPU that will be almost as fast in almost every situation, while not having that last 5% of performance for a silly power budget. You guys need to evaluate this product in the context of its target demographic.
Nope, we can also evaluate this product in the context of the time and the world it's being released into. In fact, companies would do well to look at that context, and there are good examples of companies and new businesses that thrive because they do. The fact that chip progress hasn't quite done so yet is a fact. A worrying one.

Tech enthusiast does not equal 'I buy the top-end part every time', sir.

I'm on a tech site, so why would I talk about oil or fashion here?
 
Hey I get that. This isn't a question of who's right or wrong, both ways of looking at it are valid. It depends on your use case. I think that detail is a fine thing to recognize. That is all.

The reality is, if you're not coupling this CPU with a 4090, you're doing it wrong in the gaming section. And on top of that, you aren't concerned with perf/watt at all. Even a 4090 won't realistically get you CPU-constrained gaming; I mean... it can't even hit 60 FPS in Cyberpunk :D
Nope, we can also evaluate this product in the context of the time and the world it's being released into. In fact, companies would do well to look at that context, and there are good examples of companies and new businesses that thrive because they do. The fact that chip progress hasn't quite done so yet is a fact. A worrying one.

Tech enthusiast does not equal 'I buy the top-end part every time', sir.

I'm on a tech site, so why would I talk about oil or fashion here?
That's your right. As is my right to evaluate it in the context I see fit.
 
That's your right. As is my right to evaluate it in the context I see fit.
You're still part of a community, buddy. This was feedback.

For a staff member, a more open stance could be expected. We (or at least I) aren't jabbing at you; it's about the content we're discussing and what might make it more valuable. Insights, you know. I think that gaming energy efficiency result is a very interesting thing to highlight, because it points out how little need there is to go this high up the Intel stack for 'better gaming'. Seeing as we are speaking of a gaming champ, could less actually be more? ;)
 
You guys need to evaluate this product in the context of its target demographic.
Which is for whom, exactly ~ brain-dead zombies?
So stop driving and buying foodstuffs that have been shipped across the world on fuel-oil-burning container ships
I don't do that; I've never bought anything to eat that was shipped from outside India. Yes, the car may have some imported parts, dunno for sure.
forget about fast fashion
Don't do fast fashion; in fact, I hate buying clothes. I still have 15-year-old clothes in perfectly good condition that I can wear at home!
ask your governments why they've not been building nuclear power plants
That's a tad, or quite a lot, harder than you think.
just flip a setting in the BIOS so the PL1/PL2 limits are actually enforced.
Or better yet, don't buy this freaking thing!
Or better yet, buy a non-K CPU that will be almost as fast in almost every situation, while giving up that last 5% of performance that costs a silly power budget.
Like I said at the beginning ~ for brain-dead "morons" :rolleyes:

Just for clarification, I'll add that I do spend a disproportionate amount on phones ~ I buy a phone each year and sell the old ones in exchange. Though, tbf, that's accounting for the whole family; I just keep the latest one since I'm paying :pimp:
 