
9800X3D vs 12900K - Battle of the Century

Looking like I should've gone AMD this run lol. But I'm not sure why the 12900 vs the 9800 is the battle. Shouldn't it be vs the 14900 or the KS? Not that I have high hopes for either of them, with the terrible Windows performance since the updates lol.

I don't think he has a Raptor i9 chip anymore. I'd contribute, but I don't have a 4090, so my results would never line up
 
I received all the parts yesterday, but damn, the PSU doesn't fit the case. Or rather, the combination of PSU and GPU doesn't fit the case; they are literally touching each other. Ordered a new PSU :banghead:

So I'm all set up. First impressions: the chip is scorching hot, or I just borked the cooler installation. This is insanity :eek:
Definitely not scorching hot.... would remount cooler.

Corona Benchmark Run - 79C with PBO +200 and a -30 offset
 
I don't think he has a Raptor i9 chip anymore. I'd contribute, but I don't have a 4090, so my results would never line up
I have a 4090 and a 14900KS OC'd to 5.9-6.3.
46 on the E-cores.
Memory is DDR5 2x32 (64GB) 6000, OC'd to 7000.
Maybe I could?
 
I have a 4090 and a 14900KS OC'd to 5.9-6.3.
46 on the E-cores.
Memory is DDR5 2x32 (64GB) 6000, OC'd to 7000.
Maybe I could?

Yeah, that sounds pretty awesome. I'm on the 13900KS and 7600 RAM, but only a 4080.
 
Looking like I should've gone AMD this run lol. But I'm not sure why the 12900 vs the 9800 is the battle. Shouldn't it be vs the 14900 or the KS? Not that I have high hopes for either of them, with the terrible Windows performance since the updates lol.
If you are gaming at 2K and 4K (or above), you won't benefit from the chunk of L3 cache. Also, if you can't clock the L3 faster, it will still perform just average; nothing your current setup would leverage when sidegrading.

EDIT:
@phanbuey what are you using to cool that? That seems hotter than my run (mid-day run).
 
If you are gaming at 2K and 4K (or above), you won't benefit from the chunk of L3 cache. Also, if you can't clock the L3 faster, it will still perform just average; nothing your current setup would leverage when sidegrading.

EDIT:
@phanbuey what are you using to cool that? That seems hotter than my run (mid-day run).
$45 air cooler - Thermalright Phantom Spirit EVO.

I used the same cooler setup with my 13700KF at 5.5 / 1.28V, and it throttled at a cool 89-92C @ 230W to get ~9,600,000.

My top 13700KF score below 90C was at a 220W cap, 5.3 GHz, 1.22V (voltage not accurate in HWiNFO):


I have noticed benefits in CPU-bound games at 4K (Remnant 2, Space Marine, etc.), especially if you use DLSS to push 120+ FPS with a 4090. Remnant 2 game nights are for sure smoother, as that game loses FPS as you play; by the end of the night it was a 70 FPS strugglefest. Once Windows patches RL performance in 11 24H2, it should normalize a bit though.

The performance is amazing for the minimal time and effort: RAM took 3 settings in BIOS (it was a budget KLEVV kit), and PBO + offset took like 3 tries and 2 reboots. Mobo has a "tighten secondaries" option set... $45-$50 air cooler, $110 used B650 mobo; it's the laziest "high-end rig" build so far.
 
$45 air cooler - Thermalright Phantom Spirit EVO.

I used the same cooler setup with my 13700KF at 5.5 / 1.28V, and it throttled at a cool 89-92C @ 230W to get ~9,600,000.

My top 13700KF score below 90C was at a 220W cap, 5.3 GHz, 1.22V (voltage not accurate in HWiNFO):

I have noticed benefits in CPU-bound games at 4K (Remnant 2, Space Marine, etc.), especially if you use DLSS to push 120+ FPS with a 4090. Remnant 2 game nights are for sure smoother, as that game loses FPS as you play; by the end of the night it was a 70 FPS strugglefest. Once Windows patches RL performance in 11 24H2, it should normalize a bit though.
Yeah, like I mentioned, somebody only buys X3D when you're playing at 1080p, or when what you're playing uses more CPU at the same time. I have finished Remnant 2 (at 4K as well) and never had issues with inconsistent/dropping FPS (I played it with my 6900XT, though) on this same machine. (I play my AAA titles on PC and emulators, and not-so-demanding games and some RPGs on my ROG Ally.)

On an air cooler... I see... haha.
 
Yeah, like I mentioned, somebody only buys X3D when you're playing at 1080p, or when what you're playing uses more CPU at the same time. I have finished Remnant 2 (at 4K as well) and never had issues with inconsistent/dropping FPS (I played it with my 6900XT, though) on this same machine. (I play my AAA titles on PC and emulators, and not-so-demanding games and some RPGs on my ROG Ally.)

On an air cooler... I see... haha.
I honestly think it's Windows patches... I played it before and never had issues; it for sure got worse this time around. Also, it doesn't really happen in single player, but if you host multiplayer, it leaks memory or something.
 
I honestly think it's Windows patches... I played it before and never had issues; it for sure got worse this time around. Also, it doesn't really happen in single player, but if you host multiplayer, it leaks memory or something.
I customize my Windows: I always grab the latest ISO and remove the virtualization and VBS crap altogether, with Defender stripped off the ISO. What I noticed is the high RAM usage as well (Remnant II), though it was patched in a later update of the game. It doesn't bother me anyway, as my 48GB is dedicated to gaming.
 
Definitely not scorching hot.... would remount cooler.

Corona Benchmark Run - 79C with PBO +200 and a -30 offset
Coming from the 12900K (and the 13/14900K before it), it's 30C warmer at the same power.

The 12900K was doing ~120W in CPU-Z at 55C; this thing hits ~85C.

It's also slower, but that's a given due to the core count, I guess.
 
He just blocked me over this



Now, with that said, the argument that you don't need RAM tuning for the X3D chips is kinda whack. You get ~25% more performance going from JEDEC to tuned. It's not as huge as on other platforms (where you can see 50+% gains), but it's still a lot.
It's a bit complicated because it depends on the title/game, but in most cases well-tuned 6400 is better than 8000; you'd need 8200. Of course, that's if you're chasing the last few FPS :D otherwise, whatever. It's best if your IMC can hit 6600 :)
You get (let's say) 25% on non-X3D; with X3D you will see lower percentages, because most of the data is already in the cache. The X3D saves or adds latency, depending on whether the data is in the cache or not.

Everything dips there. Intel included. I'll upload a video of it.

Also, please don't link that grifter. He is running 4800 MT/s RAM on the 9800X3D...
He's right; whether you can reproduce it is another story.
Intel loads some things faster, and that's nothing new...
Does this have any impact on gameplay? Not really; he's just using it for marketing. And in the end it can be tuned and bypassed.
 
Coming from the 12900K (and the 13/14900K before it), it's 30C warmer at the same power.

The 12900K was doing ~120W in CPU-Z at 55C; this thing hits ~85C.

It's also slower, but that's a given due to the core count, I guess.
Oh yeah, at the same power it's way hotter; the old X3D with the cache blanket hit 85C at like 75W.

Still 79C for me at 142W though. Maybe tune a PBO negative offset; these chips seem to be happy between -20 and -40.
 
So 6600 1:1 is a no-go, even with 1.3 VSOC and 1.45 VDDIO; it casually hangs in Windows. Doing 6400C28, currently stable with 2133 FCLK. I'll try 6400C26 and 2200 FCLK, then start benching.
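For reference, the first-word latency those timings imply can be sketched with a quick calculation. This is generic DDR math (CAS cycles divided by the memory clock, which is half the MT/s rate), not anything board- or chip-specific:

```python
# First-word latency in nanoseconds: latency_ns = 2000 * CL / MT/s,
# since the memory clock in MHz is half the transfer rate for DDR.
def true_latency_ns(mt_s: int, cl: int) -> float:
    return 2000 * cl / mt_s

# The kits being juggled in this thread:
for mt_s, cl in [(6400, 28), (6400, 26), (6600, 30)]:
    print(f"DDR5-{mt_s} CL{cl}: {true_latency_ns(mt_s, cl):.2f} ns")
```

So 6400C26 would shave a fraction of a nanosecond off 6400C28, which is why the last bit of tuning is mostly about FCLK and secondaries rather than raw CAS.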
 
So 6600 1:1 is a no-go, even with 1.3 VSOC and 1.45 VDDIO; it casually hangs in Windows. Doing 6400C28, currently stable with 2133 FCLK. I'll try 6400C26 and 2200 FCLK, then start benching.
They are rare, of course, and I mentioned it.
Here is one 24/7 result..
 
Is it because Intel has more cores, or is it because Intel has some other thing AMD lacks?
I doubt it's because of the cores; it's probably faster communication with RAM.
 
RAM speed is such a tiny part of the loading/streaming issue... you're loading ~8GB of assets on systems doing 70-120GB/s; that's not your bottleneck, not even close.

Most of it is due to the software and the chipset/PCIe, and how the game engine streams assets in from the SSD. Intel has typically led in chipset maturity/performance and drivers.

Typically, Intel/Wintel was the dominant platform, so most software was optimized for it. When that isn't the case (Horizon Zero Dawn), you will see assets stream much faster on AMD than on Intel, with less texture pop-in.

AMD vs Intel CPUs for Game Loading, PCIe 4.0, 3.0, SATA SSDs and HDDs Tested - YouTube. Not in-game loading, but it gives an idea.

If I were to very loosely guess, I would say software is 80% of the issue, with PCIe bus latency, Infinity Fabric, RAM, the IO die, and all the other stuff being the remaining 20%.
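The back-of-envelope math behind the "RAM isn't the bottleneck" point is easy to check. The bandwidth figures below are the standard dual-channel DDR5 peak numbers (8 bytes per channel per transfer), and the 7 GB/s SSD rate is a typical fast PCIe 4.0 drive, just for scale:

```python
# Dual-channel DDR5 peak bandwidth: MT/s * 8 bytes/channel * 2 channels.
def ddr5_dual_channel_gb_s(mt_s: int) -> float:
    return mt_s * 8 * 2 / 1000  # GB/s

assets_gb = 8  # rough size of a game's streamed asset set, per the post

for mt_s in (6000, 7000):
    bw = ddr5_dual_channel_gb_s(mt_s)
    print(f"DDR5-{mt_s}: {bw:.0f} GB/s -> {assets_gb / bw:.3f} s to move {assets_gb} GB")

# Compare against the storage side of the pipeline:
print(f"7 GB/s SSD -> {assets_gb / 7:.1f} s to move {assets_gb} GB")
```

Memory could theoretically move the whole asset set in well under a tenth of a second, while the SSD alone needs over a second; everything beyond that gap is decompression, driver, and engine overhead.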
 
Some preliminary results: it's a lot faster than my 12900K in this game, but it also consumes a LOT more power as well - holy cow.

 
Some preliminary results: it's a lot faster than my 12900K in this game, but it also consumes a LOT more power as well - holy cow.

Is it normal for it to be boosting to 5.4 GHz all-core? All the benchmarks I've seen have them boost to 5.2 GHz.
Frametime seems OK-ish as well, especially when crossing that "gate".



I found a dude benching his 9800X3D, and his frametime is practically stone-cold flat. He only enabled EXPO in his BIOS; everything else is at default settings.
Weirdly enough, all the other benchmarks I've found on YouTube have not-so-great frametimes. I think they're either fake benchmarks, or there's something up with their testing rig.
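A "stone-cold flat" frametime graph is usually quantified as 1% lows versus average FPS. Here's a rough sketch of how that's computed from a frametime capture; the sample values are made up for illustration, not taken from any of the runs in this thread:

```python
def fps_stats(frametimes_ms):
    """Average FPS and 1%-low FPS from a list of frametimes in ms."""
    avg_fps = 1000 / (sum(frametimes_ms) / len(frametimes_ms))
    worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 100)                # the slowest 1% of frames
    low_1pct_fps = 1000 / (sum(worst[:n]) / n)
    return avg_fps, low_1pct_fps

# A flat capture: every frame near 8.3 ms (~120 FPS). A spiky one can
# average the same FPS but stutter, which tanks the 1% low.
flat  = [8.3] * 100
spiky = [7.3] * 90 + [17.3] * 10
for name, frametimes in [("flat", flat), ("spiky", spiky)]:
    avg, low = fps_stats(frametimes)
    print(f"{name}: avg {avg:.0f} FPS, 1% low {low:.0f} FPS")
```

Two captures with identical average FPS can feel completely different; it's the gap between the average and the 1% low that separates a flat graph from a fake-looking one.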
 
Frametime seems OK-ish as well.



I found a dude benching his 9800X3D, and his frametime is practically stone-cold flat. He only enabled EXPO in his BIOS; everything else is at default settings.
Weirdly enough, all the other benchmarks I've found on YouTube have not-so-great frametimes. I think they're either fake benchmarks, or there's something up with their testing rig.
Depends on the game. In Warzone, yeah, frametimes should be smooth no matter what you're running; it's a decently optimized game. KCD is exactly not that :roll:

Here is Cyberpunk, heaviest of scenes. The CPU is fast, but once you enable PBO, oh god is it breathing fire.

 
Depends on the game. In Warzone, yeah, frametimes should be smooth no matter what you're running; it's a decently optimized game. KCD is exactly not that :roll:

Here is Cyberpunk, heaviest of scenes. The CPU is fast, but once you enable PBO, oh god is it breathing fire.

83C wtf

What temps do you get without PBO, and what CPU cooler are you using?
Also, I noticed you aren't using V-sync (your FPS went to 800+ in the menu); you would probably get smoother frametimes with V-sync enabled.
 
83C horyshiet,

What temps do you get without PBO, and what CPU cooler are you using?
Also, I noticed you aren't using V-sync (your FPS went to 800+ in the menu); you would probably get smoother frametimes with V-sync enabled.
I'm normally using a 236 FPS frame cap from the control panel, but since I'm benching, and in some games it can go way above 236, I'm leaving it as is for now.

Cooler is a U12A; haven't tried stock yet.
 
What offset are you running on the PBO? My PBO -30 is running almost the same temps as stock.
 