
9800X3D vs 12900K - Battle of the Century

There are many issues for AMD users with 24H2 as well.
I don't know about that, but going from 22H2 to 24H2 there was a huge drop in performance for Intel. And Windows 10 was a lot faster than Win 11 too - a 16% performance difference from Win 11 24H2 to Windows 10, same game, same scene, same settings.
 
A fix for 24h2 horrible Intel performance
It doesn't work; what worked for me is a specific version of the 24H2 preview (for my Intel systems).

Though my ROG Ally benefited from the latest 24H2 update - my FPS is now 1.5x what I had in 23H2, and I now use a lower TDP than before to push the game to better FPS.
 
I don't know about that, but going from 22H2 to 24H2 there was a huge drop in performance for Intel. And Windows 10 was a lot faster than Win 11 too - a 16% performance difference from Win 11 24H2 to Windows 10, same game, same scene, same settings.
Win10 is also faster for AMD in some games. I feel all of the telemetry they are baking into MS products is the cause. Even at work we are experiencing some serious gremlins since upgrading to 365 with 24H2. I don't know what is going on, and I can't dispute whether gaming has improved or regressed since the games I play feel the same, but Windows 11 is not what Microsoft wants it to be or thinks it is. XP was so popular because it was basically inert. Now a Windows 11 install can turn your desktop into a laptop by installing vendor software that started on laptops - in fact, Windows will ask you. Recently, because I have more than one Hotmail account, MS put an update on my PC that wanted me to pay a subscription fee to remove the data that they said was too large. I have more than one PC though, so it was a nothing burger in the end, but 24H2 needs to be heavily refined and debugged.
 
It doesn't work; what worked for me is a specific version of the 24H2 preview (for my Intel systems).

Though my ROG Ally benefited from the latest 24H2 update - my FPS is now 1.5x what I had in 23H2, and I now use a lower TDP than before to push the game to better FPS.
Ah OK, I have Windows 10 now so I can't check it myself; I just posted it in case someone wants to try.

The main issue is that the CPU keeps pulling the same power under Win 11 even though performance is much worse than on Windows 10. So what the heck is the CPU spending cycles on?

I'll also test Win 10 vs 11 on the 9800X3D and share my results.
 
So what the heck is the CPU spending cycles on?
It's not spending cycles on anything useful; it's wasting cycles, running those calculations/predictions unoptimized and inefficiently.
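For anyone who wants to sanity-check that on their own system, here's a minimal sketch (not something anyone in this thread actually ran) that logs total CPU load and average core clock once a second with psutil. Run it on both OS installs over the same scene and compare; the filename and duration are just placeholders, and package power itself still needs a separate tool like HWiNFO.

```python
# Rough utilization/clock logger - a sketch only.
# Same scene on both OS installs, then compare the logs: similar load and
# clocks but lower FPS would suggest cycles are being burned outside the game.
import csv
import time

import psutil  # pip install psutil


def log_cpu(outfile: str = "cpu_log.csv", seconds: int = 120) -> None:
    """Sample total CPU load (%) and average core clock (MHz) once per second."""
    with open(outfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t_s", "cpu_percent", "freq_mhz"])
        start = time.time()
        while time.time() - start < seconds:
            load = psutil.cpu_percent(interval=1)   # blocks for ~1 s
            freq = psutil.cpu_freq()                # may be None on some systems
            writer.writerow([round(time.time() - start, 1), load,
                             freq.current if freq else ""])


if __name__ == "__main__":
    log_cpu()
```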
 
I don't know if it's a big ask, but do you perhaps have Kingdom Come installed?
I'm doing research on my next CPU upgrade and a thread came up about something called "amdip", which seems to be a bit of a controversial topic.
He runs through a specific gate that causes him a dip that Intel supposedly doesn't suffer from.
If you have time, could you perhaps go to the same spot in-game with your 9800X3D and see if you get the same dip? I was told the man in the video doesn't know how to tune AMD CPUs and that it's probably FCLK related.
Thanks in advance.
Just so you know, since I tested this scene way before copechasers (I have videos on my channel from last year) - I can tell you he has doctored the results 100%. See, when the game begins, the mother of the protagonist is in the house. She slowly walks outside and heads towards the village. In the 3D run that is indeed the case (13:08 of his video). You can see her again at 14:06 doing her usual walk towards the village. That's all great.

But now if you watch his 14900K run - the mum has already gone to the village and is on her way back (14:45 of the video). Which means he had been in this scene for 2+ minutes; he had gone to the archway, preloaded everything, then went back to the house and pretended it was his first run and the Intel chip doesn't dip.

God this guy is sleazy

EG1. It actually takes 4 minutes - you need to spend 4 minutes in game to meet her in the archway.
 

Just so you know, since I tested this scene way before copechasers (I have videos on my channel from last year) - I can tell you he has doctored the results 100%. See, when the game begins, the mother of the protagonist is in the house. She slowly walks outside and heads towards the village. In the 3D run that is indeed the case (13:08 of his video). You can see her again at 14:06 doing her usual walk towards the village. That's all great.

But now if you watch his 14900K run - the mum has already gone to the village and is on her way back (14:45 of the video). Which means he had been in this scene for 2+ minutes; he had gone to the archway, preloaded everything, then went back to the house and pretended it was his first run and the Intel chip doesn't dip.

God this guy is sleazy

He cracks me up, but he's basically the UserBenchmark of YouTube channels... If he's making money off suckers, more power to him though. Get dat $$$, but in general it just makes people not trust actual good data.

I used to say this a lot, but if someone tries hard enough they can make any hardware look good or bad. Reviewers shouldn't need to test a certain way or jump through hoops to make the hardware look better, though; it should just work, and if it doesn't, it needed more time in the oven.

I honestly only trust what I observe locally due to stuff like that. Don't get me wrong, I love seeing other people's results, but if I can't reproduce them - whether from lack of skill or a slightly different configuration - it doesn't really matter what anyone else gets.

It's why looking at one review is so bad; you need at least 5-6 good data points to get a general idea of how something actually performs. Some might see a 5-8% improvement, others 1-3%; the truth is probably somewhere in the middle.

I do like how DF tests these days, looking for worst-case scenarios; I probably care more about that than most though.
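Just to illustrate the point about pooling data points, here's a tiny sketch with made-up numbers (not real review data) showing how a handful of reported gains can be rolled into one aggregate figure with a geometric mean.

```python
# Toy aggregation of hypothetical review results - the ratios below are
# placeholders, not figures from any actual outlet.
from statistics import geometric_mean

# Hypothetical "gain over the old chip" ratios reported by six reviews.
review_ratios = [1.05, 1.08, 1.02, 1.03, 1.06, 1.01]

overall = geometric_mean(review_ratios)
print(f"Aggregate gain: {(overall - 1) * 100:.1f}%")  # ~4.1% for these made-up numbers
```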
 
He cracks me up, but he's basically the UserBenchmark of YouTube channels... If he's making money off suckers, more power to him though. Get dat $$$, but in general it just makes people not trust actual good data.

I used to say this a lot, but if someone tries hard enough they can make any hardware look good or bad. Reviewers shouldn't need to test a certain way or jump through hoops to make the hardware look better, though; it should just work, and if it doesn't, it needed more time in the oven.

I honestly only trust what I observe locally due to stuff like that. Don't get me wrong, I love seeing other people's results, but if I can't reproduce them - whether from lack of skill or a slightly different configuration - it doesn't really matter what anyone else gets.

It's why looking at one review is so bad; you need at least 5-6 good data points to get a general idea of how something actually performs. Some might see a 5-8% improvement, others 1-3%; the truth is probably somewhere in the middle.

I do like how DF tests these days, looking for worst-case scenarios; I probably care more about that than most though.
Yeah, love DF. Their R5 1600 review was the reason I bought the chip. :toast:

Now for the topic at hand: since I'm going to buy new RAM, because why not (I'll use the 7600 kit I have for now until the new RAM arrives) - 6400C30 or 6000C28? I'm inclined towards the latter; the ICs will probably be similarly binned, but I have a feeling the C30 might not be able to drop to 28 at 6400, and there's a higher chance the 6000 kits can do 6400C28. Any suggestions?
 
Yeah, love DF. Their R5 1600 review was the reason I bought the chip. :toast:

Now for the topic at hand: since I'm going to buy new RAM, because why not (I'll use the 7600 kit I have for now until the new RAM arrives) - 6400C30 or 6000C28? I'm inclined towards the latter; the ICs will probably be similarly binned, but I have a feeling the C30 might not be able to drop to 28 at 6400, and there's a higher chance the 6000 kits can do 6400C28. Any suggestions?

I've seen some conflicting information on that, but my guess is that with X3D, whatever gives the lowest latency is probably going to beat higher speed - both will likely be pretty damn close though.
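For what it's worth, the first-word latency math says the two kits are nearly identical on paper. A quick back-of-the-envelope sketch (my own arithmetic, not benchmark data): latency in ns is roughly CL × 2000 / MT/s.

```python
# Back-of-the-envelope CAS latency in nanoseconds: CL * 2000 / (MT/s).
# Bandwidth, tRFC, FCLK etc. matter too, so this is only a rough comparison.
def cas_ns(mts: int, cl: int) -> float:
    return cl * 2000 / mts

for mts, cl in [(6000, 28), (6400, 30), (6000, 30), (6400, 28)]:
    print(f"DDR5-{mts} CL{cl}: {cas_ns(mts, cl):.2f} ns")

# DDR5-6000 CL28: 9.33 ns
# DDR5-6400 CL30: 9.38 ns
# DDR5-6000 CL30: 10.00 ns
# DDR5-6400 CL28: 8.75 ns
```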
 
I've seen some conflicting information on that, but my guess is that with X3D, whatever gives the lowest latency is probably going to beat higher speed - both will likely be pretty damn close though.
Thing is, my A-dies (7600C36) refuse to drop to C30 even at 6000 speeds with 1.5 V. Which is why I feel the 28 will probably be the better deal.

Eg1. They can do 8000C34, but 6000C30? Nope, no boot.
 
Thing is, my A-dies (7600C36) refuse to drop to C30 even at 6000 speeds with 1.5 V. Which is why I feel the 28 will probably be the better deal.

Eg1. They can do 8000C34, but 6000C30? Nope, no boot.

I think if I bought a kit tomorrow it would be a 6000C28 kit, cuz I am lazy and don't want to try and make it work with a 6000 CL30 kit lol...
 
Thing is, my A-dies (7600C36) refuse to drop to C30 even at 6000 speeds with 1.5 V. Which is why I feel the 28 will probably be the better deal.

Eg1. They can do 8000C34, but 6000C30? Nope, no boot.

Yeah, my 7600 kit was the same. I just got some A-dies - 6000 CL28 all day - and sold the 7600 kit to the friend buying the 13700K setup.
KLEVV CRAS V RGB DDR5 64GB (2x32GB) 6400MHz CL32 A-DIE 1.35V Gaming Desktop Ram Memory SK Hynix Chip XMP 3.0 / AMD Expo Ready - Black (KD5BGUA80-64A320G) at Amazon.com

Also figured out my Hogwarts Legacy stutter - moved the game to my primary SSD and all stutters are gone :/

I've got a ton of games on that other WD SN850 2TB, so I'm wondering why it's stuttering :(
 
So I just tested KCD repeatedly; it takes 4 minutes for the mother of the protagonist to go down to the village, fill the bucket with water and start walking back. You have to be in game for 4 minutes in order to meet her in the archway on her way back. Which means copechasers was running around for 4 minutes with his 14900K, preloaded everything, and then went back to the house and started recording, pretending he had just started the game.

Yeah, my 7600 kit was the same. I just got some A-dies - 6000 CL28 all day - and sold the 7600 kit to the friend buying the 13700K setup.
KLEVV CRAS V RGB DDR5 64GB (2x32GB) 6400MHz CL32 A-DIE 1.35V Gaming Desktop Ram Memory SK Hynix Chip XMP 3.0 / AMD Expo Ready - Black (KD5BGUA80-64A320G) at Amazon.com

Also figured out my Hogwarts Legacy stutter - moved the game to my primary SSD and all stutters are gone :/

I've got a ton of games on that other WD SN850 2TB, so I'm wondering why it's stuttering :(
So you think my 7600 kits won't work? Like at all?
 
copechasers

I also noticed that in his "Rust" video he used two different rendering scenes. Aren't they supposed to be the same scene for a direct comparison?

From what I observed, the AMD CPU was rendering scenes from further away, while the Intel CPU was just rendering up-close rocks lol.

Anyways, interesting video nonetheless.
 

Just so you know, since I tested this scene way before copechasers (I have videos on my channel from last year) - I can tell you he has doctored the results 100%. See, when the game begins, the mother of the protagonist is in the house. She slowly walks outside and heads towards the village. In the 3D run that is indeed the case (13:08 of his video). You can see her again at 14:06 doing her usual walk towards the village. That's all great.

But now if you watch his 14900K run - the mum has already gone to the village and is on her way back (14:45 of the video). Which means he had been in this scene for 2+ minutes; he had gone to the archway, preloaded everything, then went back to the house and pretended it was his first run and the Intel chip doesn't dip.

God this guy is sleazy
That's some fine detective work.

The mother NPC at 14:06 is easy to miss.
 
I also noticed that in his "Rust" video he used two different rendering scenes. Aren't they supposed to be the same scene for a direct comparison?

From what I observed, the AMD CPU was rendering scenes from further away, while the Intel CPU was just rendering up-close rocks lol.

Anyways, interesting video nonetheless.
Ah, yes, I haven't watched the whole video.

In Rust it's those megastructures that drop framerates. On the 3D chip he is testing right in front of a "castle"; on the 14900K he is testing on an empty beach with nothing but rocks.

So scientific...

That's some fine detective work.

The mother NPC at 14:06 is easy to miss.
I tested it; it takes 4 minutes to be able to meet that NPC on her way back. He literally spent 4 minutes in game and then went back to the house and started recording to pretend it's a fresh run :banghead:
 
Also figured out my Hogwarts Legacy stutter - moved the game to my primary SSD and all stutters are gone :/
If your primary SSD is the 4TB WD SN850X
and your secondary SSD is the WD SN850 2TB,

I think the SN850X has DRAM while the SN850 is DRAM-less. Could that be the issue? Only thing I can think of.
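If you want a quick way to compare the two drives, here's a crude sketch (paths are placeholders; a proper tool like CrystalDiskMark or fio is the right answer) that times small random reads from a large existing file on each drive. A drive that is much slower here would be a plausible stutter suspect, though OS caching can flatter the numbers.

```python
# Crude random-read probe - a sketch only; use a real benchmark for serious numbers.
import os
import random
import time


def random_read_ms(path: str, reads: int = 2000, block: int = 4096) -> float:
    """Average time in ms per 4 KiB read at random offsets of an existing file."""
    size = os.path.getsize(path)
    fd = os.open(path, os.O_RDONLY | getattr(os, "O_BINARY", 0))
    try:
        start = time.perf_counter()
        for _ in range(reads):
            os.lseek(fd, random.randrange(0, max(size - block, 1)), os.SEEK_SET)
            os.read(fd, block)
        return (time.perf_counter() - start) * 1000 / reads
    finally:
        os.close(fd)


if __name__ == "__main__":
    # Placeholder paths - point these at a big file on each SSD.
    for test_file in (r"C:\games\bigfile.bin", r"D:\games\bigfile.bin"):
        print(test_file, f"{random_read_ms(test_file):.3f} ms per 4 KiB read")
```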
 
Just finished my build, which was frustrating with the RAM issues, so I don't think I'll try any fiddling with those haha.
 
I would love it if you could test Troy Total War; it's so CPU-bound (but also scales incredibly well across cores with grass on extreme) that it's a great benchmark. You can even do it at high res; it's almost impossible to get GPU-bound with any mid-range GPU (excluding abusing stuff like 8x MSAA @ 4K). Lemme know, I'll prepare you a replay which you can run. :toast:
I tested the in-game battle bench - it used every single thread of the CPU (hit 170 W power draw :kookoo: ) and averaged 257 fps on max settings. Is that to your satisfaction?
 
I tested the in-game battle bench - it used every single thread of the CPU (hit 170 W power draw :kookoo: ) and averaged 257 fps on max settings. Is that to your satisfaction?
Damn, that's crazy! Thanks for the test. Any chance you could try this replay? The FPS differences are just so big vs my R5 7600 (in the low 60s even at 1080p in the battle benchmark). Thing is, I suspect the game scales CPU load with resolution too in some way, but it's hard for me to be sure, because my RX 6800 is too weak to draw a conclusion from, even when it's underutilised (both % and W). A minute into the battle you should see FPS go much lower; if you could also switch between 1080p and 4K and note the difference, that would be really great. No need to watch longer than that - performance remains similar throughout; it's the initial battle clash that sets the tone. Cheers! :rockout:
 

Damn, that's crazy! Thanks for the test. Any chance you could try this replay? The FPS differences are just so big vs my R5 7600 (in the low 60s even at 1080p in the battle benchmark). Thing is, I suspect the game scales CPU load with resolution too in some way, but it's hard for me to be sure, because my RX 6800 is too weak to draw a conclusion from, even when it's underutilised (both % and W). A minute into the battle you should see FPS go much lower; if you could also switch between 1080p and 4K and note the difference, that would be really great. No need to watch longer than that - performance remains similar throughout; it's the initial battle clash that sets the tone. Cheers! :rockout:
Since the replay doesn't work, I tested again and realized that at 1080p the 4090 is a huge bottleneck. So I dropped to 720p - 12900K = 340 fps average, 282 minimums.

Still haven't set up the 9800X3D; UPS ditched me and I'm still waiting for the new case.

[Screenshot: 12900k troy total war.JPG]

Damn, that's crazy! Thanks for the test. Any chance you could try this replay? The FPS differences are just so big vs my R5 7600 (in the low 60s even at 1080p in the battle benchmark). Thing is, I suspect the game scales CPU load with resolution too in some way, but it's hard for me to be sure, because my RX 6800 is too weak to draw a conclusion from, even when it's underutilised (both % and W). A minute into the battle you should see FPS go much lower; if you could also switch between 1080p and 4K and note the difference, that would be really great. No need to watch longer than that - performance remains similar throughout; it's the initial battle clash that sets the tone. Cheers! :rockout:
OK, so I followed the instructions in your PM: 1 defender, 1 attacker, max units - I focused mainly on the middle of the armageddon. When the fight started, FPS was 200-240, but after a couple of minutes it dropped; the average was around 170, and the lowest I saw was 149 fps. I let it drag out for around 6 minutes.
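Side note for anyone comparing numbers like these: averages and lows are easy to pull out of a frametime capture. A small sketch below, assuming a plain text export with one frametime in milliseconds per line (column names and formats vary between capture tools, so adjust the parsing to whatever you use).

```python
# Quick average / 1% low calculator from a frametime dump - a sketch only.
import statistics


def fps_stats(path: str) -> tuple[float, float]:
    """Return (average FPS, 1% low FPS) from one frametime (ms) per line."""
    with open(path) as f:
        frametimes = [float(line) for line in f if line.strip()]
    avg_fps = 1000 / statistics.mean(frametimes)
    # 1% low: average FPS over the slowest 1% of frames.
    worst = sorted(frametimes, reverse=True)[: max(1, len(frametimes) // 100)]
    one_pct_low = 1000 / statistics.mean(worst)
    return avg_fps, one_pct_low


if __name__ == "__main__":
    avg, low = fps_stats("frametimes.txt")  # placeholder filename
    print(f"avg {avg:.0f} fps, 1% low {low:.0f} fps")
```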
 
Since the replay doesn't work, I tested again and realized that at 1080p the 4090 is a huge bottleneck. So I dropped to 720p - 12900K = 340 fps average, 282 minimums.

Still haven't set up the 9800X3D; UPS ditched me and I'm still waiting for the new case.



OK, so I followed the instructions in your PM: 1 defender, 1 attacker, max units - I focused mainly on the middle of the armageddon. When the fight started, FPS was 200-240, but after a couple of minutes it dropped; the average was around 170, and the lowest I saw was 149 fps. I let it drag out for around 6 minutes.
Thanks a bunch, that's awesome! For me, even at 1080p it goes under 30 fps during the battle. Guess the 7600 is a bigger bottleneck with just 6 cores. At 4K it's obviously brutal on the GPU, so it's harder to tell, but the CPU can easily stay pegged >90% on all cores. If the 9950X3D does V-cache on both chiplets, I might just give in and buy it instead of the 9800X3D.

 
Thanks a bunch, that's awesome! For me, even at 1080p it goes under 30 fps during the battle. Guess the 7600 is a bigger bottleneck with just 6 cores. At 4K it's obviously brutal on the GPU, so it's harder to tell, but the CPU can easily stay pegged >90% on all cores. If the 9950X3D does V-cache on both chiplets, I might just give in and buy it instead of the 9800X3D.

I feel it's the RAM; I can test with 6 cores if you want.


I would post a video, but this game doesn't let me record with GeForce Experience - something is borked.
 
I received all the parts yesterday, but damn, the PSU doesn't fit the case. Or rather, the combination of PSU and GPU doesn't fit the case; they are literally touching each other. Ordered a new PSU :banghead:

So I'm all set up. First impressions: the chip is scorching hot, or I just borked the cooler installation. This is insanity :eek:
 
Looks like I should've gone AMD this run lol. But I'm not sure why the 12900K vs the 9800X3D is the battle - shouldn't it be vs the 14900K or the KS? Not that I have high hopes for either of them with the terrible Windows performance since the updates lol.
 