
9800X3D vs 12900K - Battle of the Century

Joined
Nov 16, 2023
Messages
1,435 (3.63/day)
Location
Nowhere
System Name I don't name my rig
Processor 14700K
Motherboard Asus TUF Z790
Cooling Air/water/DryIce
Memory DDR5 G.Skill Z5 RGB 6000MHz C36
Video Card(s) RTX 4070 Super
Storage 980 Pro
Display(s) Some LED 1080P TV
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Corsair 1050w HX series
Mouse Razer Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max freq: 13700K 6.7GHz (dry ice); 14700K 7.0GHz (dry ice); all-time max: FX-8300 7685MHz (LN2)
That's not true either. Memory latency makes a huge difference. The thing is that in this particular instance with the X3D, it's just the AIDA reading that's off, not the memory latency itself.

The reason my 12900K is only 15-20% off from the 9800X3D is the 47ns of latency, compared to the 60+ ns you'll get with XMP.
Sorry, was at work, so couldn't reply right away. Busy day.

15 to 20% what? FPS?? Nah brother, I need way more data than this.

"Tune memory"? Start with 7-Zip on BenchMate.
If you improve the score tweaking only timings and nothing else, you'll have a fast rig :)
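
For context on what a number like 47ns actually measures: it's roughly how long one dependent load takes to come back from DRAM. Here's a minimal pointer-chase sketch in C (my own illustration, not how AIDA64 actually implements its test); with a buffer far bigger than L3 it lands in the same ballpark as AIDA-style readings, and with a small buffer you measure cache latency instead:

```c
/* Minimal pointer-chase latency sketch (illustration only; AIDA64's
 * real method is its own). Each load depends on the previous one,
 * so the CPU can't hide DRAM latency behind parallelism. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (64UL * 1024 * 1024 / sizeof(size_t))  /* 64 MiB buffer, well past L3 */
#define STEPS 20000000UL

int main(void) {
    size_t *chain = malloc(N * sizeof *chain);
    if (!chain) return 1;
    for (size_t i = 0; i < N; i++) chain[i] = i;
    /* Sattolo's shuffle: a single full-length cycle the prefetcher can't predict */
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = (size_t)rand() % i;
        size_t t = chain[i]; chain[i] = chain[j]; chain[j] = t;
    }
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    size_t p = 0;
    for (unsigned long s = 0; s < STEPS; s++) p = chain[p];  /* dependent loads */
    clock_gettime(CLOCK_MONOTONIC, &t1);
    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (double)(t1.tv_nsec - t0.tv_nsec);
    printf("~%.1f ns per load (ignore: %zu)\n", ns / STEPS, p);
    free(chain);
    return 0;
}
```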
 
Joined
Jun 14, 2020
Messages
3,530 (2.15/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Yeah, but you're on immature BIOSes; the 12900K took like a year to get the latency down when it launched, since even DDR5-5600 RAM was impossible to get hold of. The average latency people were getting was 65-75ns, the first boards/BIOS versions with the lowest latency were DDR4, and the initial MSI BIOSes didn't hit anywhere near 47ns -- it took a while, plus new BIOSes and DIMMs, to get there. My 7600MT/s kit didn't get a proper BIOS until around June 2023 that let me break below 55ns.

And AIDA is self-admittedly buggy -- they will likely release a new AIDA version and you will 'miraculously' be sitting at 52ns on the AM5 system with no change in FPS in games.
Not particularly true; I've had mine since November 2021, but yeah, I had to wait 3 months for DDR5 kits - latency on early BIOSes was fine, at least on mine. You had a 13700K, which has higher latency (more cache) than the 12900K anyway; you can't get 47ns on Raptor Lake (with sane voltages, that is).

Sorry, was at work, so couldn't reply right away. Busy day.

15 to 20% what? FPS?? Nah brother, I need way more data than this.

"Tune memory"? Start with 7-Zip on BenchMate.
If you improve the score tweaking only timings and nothing else, you'll have a fast rig :)
I've seen a ~30% improvement in R&C, for example, going from 6000C36 XMP to 7000C30 tuned on the 12900K. That's from tweaking memory alone. Not all games scale like that, but most of them can get a 15% improvement at the very least.

Funny you mentioned 7-Zip; that's exactly what R&C (and all Insomniac games) do - they unzip assets on the fly while you are playing the game, which is why memory helps there.
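
To make the unzip-on-the-fly point concrete, here's a rough C sketch that times in-memory decompression, using zlib purely as a stand-in codec (the games use their own compressors, so treat this as an analogy, not their pipeline). The decompression loop streams through large buffers, which is exactly the kind of work that speeds up when you tighten memory:

```c
/* Sketch: time in-memory decompression, the kind of on-the-fly asset
 * unpacking described above. zlib is a stand-in codec; link with -lz. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <zlib.h>

int main(void) {
    const uLong raw_len = 128UL * 1024 * 1024;        /* 128 MiB of fake "assets" */
    unsigned char *raw = malloc(raw_len);
    if (!raw) return 1;
    for (uLong i = 0; i < raw_len; i++) raw[i] = (unsigned char)((i * 31) ^ (i >> 7));

    uLongf comp_len = compressBound(raw_len);
    unsigned char *comp = malloc(comp_len);
    if (!comp || compress(comp, &comp_len, raw, raw_len) != Z_OK) return 1;

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    uLongf out_len = raw_len;
    uncompress(raw, &out_len, comp, comp_len);        /* the memory-hungry part */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double sec = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    printf("decompressed at %.0f MiB/s\n", raw_len / (1024.0 * 1024.0) / sec);
    free(raw); free(comp);
    return 0;
}
```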
 
Joined
Nov 11, 2016
Messages
3,439 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.Skill 6400MT/s CAS32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razer DeathAdder V3
Keyboard Razer Huntsman V3 Pro TKL
Software win11
Not particularly true; I've had mine since November 2021, but yeah, I had to wait 3 months for DDR5 kits - latency on early BIOSes was fine, at least on mine. You had a 13700K, which has higher latency (more cache) than the 12900K anyway; you can't get 47ns on Raptor Lake (with sane voltages, that is).


I've seen a ~30% improvement in R&C, for example, going from 6000C36 XMP to 7000C30 tuned on the 12900K. That's from tweaking memory alone. Not all games scale like that, but most of them can get a 15% improvement at the very least.

Funny you mentioned 7-Zip; that's exactly what R&C (and all Insomniac games) do - they unzip assets on the fly while you are playing the game, which is why memory helps there.

I went from 6000 CAS36 to 7200 CAS34 with Buildzoid-tuned timings on my 13700KF and found a negligible difference in PUBG; then along comes the 9800X3D with 6000 CAS32 XMP and destroys the 13700KF by around 35% in PUBG.

For single-player games I always use ray tracing, so the CPU doesn't matter much.
 
Joined
Aug 9, 2019
Messages
1,717 (0.88/day)
Processor 7800X3D 2x16GB CO
Motherboard Asrock B650m HDV
Cooling Peerless Assassin SE
Memory 2x16GB DR A-die@6000c30 tuned
Video Card(s) Asus 4070 dual OC 2610@915mv
Storage WD blue 1TB nvme
Display(s) Lenovo G24-10 144Hz
Case Corsair 4000D Airflow
Power Supply EVGA GQ 650W
Software Windows 10 home 64
Benchmark Scores Superposition 8k 5267 Aida64 58.5ns
Not particularly true; I've had mine since November 2021, but yeah, I had to wait 3 months for DDR5 kits - latency on early BIOSes was fine, at least on mine. You had a 13700K, which has higher latency (more cache) than the 12900K anyway; you can't get 47ns on Raptor Lake (with sane voltages, that is).


I've seen a ~30% improvement in R&C, for example, going from 6000C36 XMP to 7000C30 tuned on the 12900K. That's from tweaking memory alone. Not all games scale like that, but most of them can get a 15% improvement at the very least.

Funny you mentioned 7-Zip; that's exactly what R&C (and all Insomniac games) do - they unzip assets on the fly while you are playing the game, which is why memory helps there.
My 12400F, bought in February 2022, was crap on stability, latency, everything, for 4 months until the fifth BIOS update dropped. I struggled to get 3500 Gear 1 stable at first; 3700 Gear 1 worked after BIOS 5.
 
Joined
Sep 20, 2021
Messages
467 (0.40/day)
Processor Ryzen 7 9700x
Motherboard Asrock B650E PG Riptide WiFi
Cooling Underfloor CPU cooling
Memory 2x32GB 6200MT/s
Video Card(s) 4080 SUPER Noctua OC Edition
Storage Kingston Fury Renegade 1TB, Seagate Exos 12TB
Display(s) MSI Optix MAG301RF 2560x1080@200Hz
Case Phanteks Enthoo Pro
Power Supply NZXT C850 850W Gold
Mouse Bloody W95 Max Naraka
If AIDA is the outlier… it IS the problem.
The board partners said there was a latency problem after a BIOS update, and then they released BIOSes that fixed it, but you think the problem is in AIDA?
OK :)
 
Joined
Apr 14, 2018
Messages
689 (0.28/day)
The board partners said there was a latency problem after a BIOS update, and then they released BIOSes that fixed it, but you think the problem is in AIDA?
OK :)

I mean, even the AMD engineer states it's an “artifact of AIDA”. Other latency comparison programs show the same or near-identical latency across all the other modes.

You can’t lead a brick to water though.
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
8,741 (3.83/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
AIDA does all kinds of weird stuff.

To be taken with a grain of salt..

Fun toy to play with, but not worth the asking price.
 
Joined
Sep 20, 2021
Messages
467 (0.40/day)
Processor Ryzen 7 9700x
Motherboard Asrock B650E PG Riptide WiFi
Cooling Underfloor CPU cooling
Memory 2x32GB 6200MT/s
Video Card(s) 4080 SUPER Noctua OC Edition
Storage Kingston Fury Renegade 1TB, Seagate Exos 12TB
Display(s) MSI Optix MAG301RF 2560x1080@200Hz
Case Phanteks Enthoo Pro
Power Supply NZXT C850 850W Gold
Mouse Bloody W95 Max Naraka
I mean, even the AMD engineer states it's an “artifact of AIDA”. Other latency comparison programs show the same or near-identical latency across all the other modes.

You can’t lead a brick to water though.
That's interesting, show me where they said it please :)
 
Joined
Nov 16, 2023
Messages
1,435 (3.63/day)
Location
Nowhere
System Name I don't name my rig
Processor 14700K
Motherboard Asus TUF Z790
Cooling Air/water/DryIce
Memory DDR5 G.Skill Z5 RGB 6000MHz C36
Video Card(s) RTX 4070 Super
Storage 980 Pro
Display(s) Some LED 1080P TV
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Corsair 1050w HX series
Mouse Razer Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max freq: 13700K 6.7GHz (dry ice); 14700K 7.0GHz (dry ice); all-time max: FX-8300 7685MHz (LN2)
Not particularly true; I've had mine since November 2021, but yeah, I had to wait 3 months for DDR5 kits - latency on early BIOSes was fine, at least on mine. You had a 13700K, which has higher latency (more cache) than the 12900K anyway; you can't get 47ns on Raptor Lake (with sane voltages, that is).


I've seen a ~30% improvement in R&C, for example, going from 6000C36 XMP to 7000C30 tuned on the 12900K. That's from tweaking memory alone. Not all games scale like that, but most of them can get a 15% improvement at the very least.

Funny you mentioned 7-Zip; that's exactly what R&C (and all Insomniac games) do - they unzip assets on the fly while you are playing the game, which is why memory helps there.
Makes sense, I did not know Insomniac games did that! Learn something new every day!!
 
Joined
Apr 14, 2018
Messages
689 (0.28/day)
That's interesting, show me where they said it please :)

I shall spoon feed the brick…

(attached screenshot: the AMD engineer's statement)
 
Joined
Nov 16, 2023
Messages
1,435 (3.63/day)
Location
Nowhere
System Name I don't name my rig
Processor 14700K
Motherboard Asus TUF Z790
Cooling Air/water/DryIce
Memory DDR5 G.Skill Z5 RGB 6000MHz C36
Video Card(s) RTX 4070 Super
Storage 980 Pro
Display(s) Some LED 1080P TV
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Corsair 1050w HX series
Mouse Razer Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max freq: 13700K 6.7GHz (dry ice); 14700K 7.0GHz (dry ice); all-time max: FX-8300 7685MHz (LN2)
AIDA does all kinds of weird stuff.

To be taken with a grain of salt..

Fun toy to play with, but not worth the asking price.
Everything is to be taken with salt until replicated accurately and repeatedly (averages).
I love Windows tools for tweaking, but understand they also use resources.
Measuring tools use resources, so while benchmarking, don't measure anything.
That way you don't have apps degrading performance unnecessarily in the background.

By the way homie, a 9800X3D completely smashes my 14700K at 6.1GHz in 7-Zip. ;)
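
That "replicated accurately and repeatedly (averages)" point, in sketch form: run the same thing several times and trust the mean and spread, never a single run. The workload below is just a stand-in:

```c
/* Sketch: replicate a benchmark and report mean and spread instead of
 * trusting one run. run_once() is a stand-in workload. Link with -lm. */
#include <math.h>
#include <stdio.h>
#include <time.h>

static double run_once(void) {
    volatile double acc = 0.0;
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (long i = 1; i < 20000000L; i++) acc += 1.0 / (double)i;  /* busy work */
    clock_gettime(CLOCK_MONOTONIC, &t1);
    return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
}

int main(void) {
    enum { RUNS = 10 };
    double t[RUNS], mean = 0.0, var = 0.0;
    for (int i = 0; i < RUNS; i++) { t[i] = run_once(); mean += t[i]; }
    mean /= RUNS;
    for (int i = 0; i < RUNS; i++) var += (t[i] - mean) * (t[i] - mean);
    printf("mean %.4f s, stddev %.4f s over %d runs\n",
           mean, sqrt(var / (RUNS - 1)), RUNS);
    return 0;
}
```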
 

Degreco

New Member
Joined
Aug 10, 2024
Messages
15 (0.12/day)
Makes perfect sense -- look at 5800X3D reviews from 2 years ago with a 3090 and now compare them to games running today with a 4080/4090. Also, the 9800X3D will annihilate a 7950X with SMT off in gaming in anything that isn't completely GPU-bound.

Ryzen 7 9800X3D, Really Faster For Real-World 4K Gaming?
2022:
View attachment 371453

2024:
View attachment 371455
A red flag should go up at a 3-game average test series with 'balanced settings'!

I do not believe that you will see these differences with a 4090 at 4K in reality. That's coming from someone running a 4090 and a 5800X3D. A 3-game average is not gonna cut it, sorry.
Steve from HU/Techspot is also known to carefully select games and settings that drive his argument home. Also, his 'Ultra' is not TechPowerUp's Ultra, or what most gamers know as ultra. In no way, and I say this with all my experience, will you ever see a roughly 25% bump in average(!) FPS between a 5800X and a 5800X3D at 4K with a 4090. We could debate minimum FPS, but not average FPS. If anything, those two Techspot charts only show how the reviewer shapes his desired message into them.

Other sites like TechPowerUp or GamersNexus report nearly identical average 4K figures with the 4090 for several CPUs across a series of a dozen games like CP2077, Starfield, The Last of Us, RDR2 and so on. There are outliers that scale at 4K, but in a 50-game average they will be buried in the result. Of course, if one throws e.g. Spider-Man and Hogwarts Legacy (weird engines that do not behave like the majority of engines) into a 3-game selection and uses undisclosed 'balanced settings' under a misleading 'Ultra' title, one can achieve those results.

Just my two cents on those charts.
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
8,741 (3.83/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
Joined
Nov 13, 2007
Messages
10,818 (1.73/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
I do not believe that you will see these differences with a 4090 at 4K in reality. That's coming from someone running a 4090 and a 5800X3D. A 3-game average is not gonna cut it, sorry.
Steve from HU/Techspot is also known to carefully select games and settings that drive his argument home. Also, his 'Ultra' is not TechPowerUp's Ultra, or what most gamers know as ultra. In no way, and I say this with all my experience, will you ever see a roughly 25% bump in average(!) FPS between a 5800X and a 5800X3D at 4K with a 4090. We could debate minimum FPS, but not average FPS. If anything, those two Techspot charts only show how the reviewer shapes his desired message into them.

Other sites like TechPowerUp or GamersNexus report nearly identical average 4K figures with the 4090 for several CPUs across a series of a dozen games like CP2077, Starfield, The Last of Us, RDR2 and so on. There are outliers that scale at 4K, but in a 50-game average they will be buried in the result. Of course, if one throws e.g. Spider-Man and Hogwarts Legacy (weird engines that do not behave like the majority of engines) into a 3-game selection and uses undisclosed 'balanced settings' under a misleading 'Ultra' title, one can achieve those results.

Just my two cents on those charts.
I went from a 13700KF at 5.5GHz to a 9800X3D @ 5.4GHz, and at 4K I can definitely tell a difference in Remnant 2, which is the game I've been no-lifing with friends since I made the switch - especially in the LOW fps dips.

This is the same argument as 5800X vs 5800X3D back in the day. At 4K native they were the same. Today, there is a 5-10% difference between these two CPUs at 4K on average, including games that don't care about CPU, but a 30%+ difference in games that do, or if you use DLSS or FSR.

At the end of the day it's a faster CPU -- you can disable DLSS and push ULTRA settings to move the bottleneck onto the GPU and then declare that all these CPUs are the same, but most people that game at 4K don't do this - they're all running 144Hz+ monitors and they basically shoot for 100+ FPS by tweaking settings, which means running FSR/DLSS almost all the time (also, upscaling actually looks great at 4K).
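
For anyone wondering why upscaling makes 4K more CPU-bound: the GPU renders a fraction of the pixels while the CPU still simulates and submits the same frames. Rough numbers below, assuming the commonly cited DLSS scale factors (exact ratios vary by title and version):

```c
/* Rough math: internal render resolution at 4K output for the commonly
 * cited DLSS modes. Scale factors are approximate assumptions. */
#include <stdio.h>

int main(void) {
    const int w = 3840, h = 2160;
    const struct { const char *mode; double scale; } m[] = {
        {"Native/DLAA", 1.0},
        {"Quality",     2.0 / 3.0},
        {"Balanced",    0.58},
        {"Performance", 0.5},
    };
    for (int i = 0; i < 4; i++) {
        int rw = (int)(w * m[i].scale), rh = (int)(h * m[i].scale);
        printf("%-12s renders %4dx%-4d (%.0f%% of the pixels)\n",
               m[i].mode, rw, rh, 100.0 * m[i].scale * m[i].scale);
    }
    return 0;
}
```

At Performance mode the GPU shades a quarter of the pixels, so the CPU gets exposed much sooner - which is why the X3D gap shows up once DLSS is on.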
 
Joined
Oct 19, 2022
Messages
114 (0.14/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D (+PBO)
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Samsung 990 PRO 2TB w/ Heatsink SSD + Seagate FireCuda 530 SSD 2TB w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz monitor (+ LG OLED C9 55" TV 4K@120Hz)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q with AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
A red flag should go up at a 3-game average test series with 'balanced settings'!

I do not believe that you will see these differences with a 4090 at 4K in reality. That's coming from someone running a 4090 and a 5800X3D. A 3-game average is not gonna cut it, sorry.
Steve from HU/Techspot is also known to carefully select games and settings that drive his argument home. Also, his 'Ultra' is not TechPowerUp's Ultra, or what most gamers know as ultra. In no way, and I say this with all my experience, will you ever see a roughly 25% bump in average(!) FPS between a 5800X and a 5800X3D at 4K with a 4090. We could debate minimum FPS, but not average FPS. If anything, those two Techspot charts only show how the reviewer shapes his desired message into them.

Other sites like TechPowerUp or GamersNexus report nearly identical average 4K figures with the 4090 for several CPUs across a series of a dozen games like CP2077, Starfield, The Last of Us, RDR2 and so on. There are outliers that scale at 4K, but in a 50-game average they will be buried in the result. Of course, if one throws e.g. Spider-Man and Hogwarts Legacy (weird engines that do not behave like the majority of engines) into a 3-game selection and uses undisclosed 'balanced settings' under a misleading 'Ultra' title, one can achieve those results.

Just my two cents on those charts.
Agreed. I have a 4090 and just upgraded from a 5900X with very tight timings (4x8GB @ 3733MHz 14-14-14-28) to a 9800X3D (2x32GB 6400MHz 30-38-38-30), and the difference at native 4K is almost insignificant unless the game is very CPU-bound like Spider-Man, Far Cry 6, etc., but it's not night and day either; just a few FPS more with the 9800X3D (mostly better 1% and 0.1% lows).

If you're playing Assetto Corsa and some single-threaded games then yeah, the X3D will definitely make a difference, but most modern games do not run like that, since most games are also on consoles and are optimized for 8c/16t CPUs.

Fast CPU <=> RAM communication, as you know, is very important.
My AIDA64 result jumped from ~56 to ~66ns, so:

FPS decreases.
Anything related to latency will run slower.

If you do a Google search, you'll read a lot about it :)
My 9800X3D was at 80.1ns with the default XMP profile of my G.Skill Trident Z Royal 2x32GB 6400MHz CL32 :( After tuning the RAM I was able to achieve 68.1ns, and with MSI Latency Killer I'm now at 61.1ns :D
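
Quick math on those numbers, for anyone skimming:

```c
/* Check of the latency figures quoted above (80.1 -> 68.1 -> 61.1 ns). */
#include <stdio.h>

int main(void) {
    const double xmp = 80.1, tuned = 68.1, killer = 61.1;  /* ns, from the post */
    printf("XMP -> tuned: %.1f%% lower\n", 100.0 * (xmp - tuned) / xmp);   /* ~15.0% */
    printf("XMP -> final: %.1f%% lower\n", 100.0 * (xmp - killer) / xmp);  /* ~23.7% */
    return 0;
}
```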
 

Degreco

New Member
Joined
Aug 10, 2024
Messages
15 (0.12/day)
and the difference at native 4K is almost insignificant
Yes, that's also my experience with the 4090 in several rigs.
I went from a 13700KF at 5.5GHz to a 9800X3D @ 5.4GHz, and at 4K I can definitely tell a difference in Remnant 2
So, what are you saying? Let's ditch the 3-game average from Techspot and concentrate on one game? Even better, one game with a spotless implementation of the UE engine, one that behaves so well it can stand as a shiny example of any CPU gains you get at 1440p/4K?

I remember the controversy when this game came out:
(attached image: "they designed the game with upscaling in mind - here it is")

So, Remnant 2 is not a good example for a one-game 4K test.
 
Joined
Oct 19, 2022
Messages
114 (0.14/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D (+PBO)
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Samsung 990 PRO 2TB w/ Heatsink SSD + Seagate FireCuda 530 SSD 2TB w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz monitor (+ LG OLED C9 55" TV 4K@120Hz)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q with AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
So, what are you saying? Let's ditch the 3-game average from Techspot and concentrate on one game? Even better, one game with a spotless implementation of the UE engine, one that behaves so well it can stand as a shiny example of any CPU gains you get at 1440p/4K?
Yeah, maybe there's this one game, in a specific area or something... like my 5900X would bottleneck my 4090 at 50% while facing the train and the fire pit with the old lady and the young girl in Metro Exodus: Enhanced Edition, but other than that I never had any huge bottlenecks. It's the exception.
 
Joined
Nov 13, 2007
Messages
10,818 (1.73/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Yes, that's also my experience with the 4090 in several rigs.

So, what are you saying? Let's ditch the 3-game average from Techspot and concentrate on one game? Even better, one game with a spotless implementation of the UE engine, one that behaves so well it can stand as a shiny example of any CPU gains you get at 1440p/4K?

I remember the controversy when this game came out:
View attachment 375629
So, Remnant 2 is not a good example for a one-game 4K test.
It's additive, so there are actually 4 games for you. Point is, life is short - play games. If there is a game that's struggling to feed your GPU, then upgrade your CPU. Simple stuff -- anything below 80% GPU usage for me is struggling, not feeding the GPU enough, so I upgrade.

Yeah, maybe there's this one game, in a specific area or something... like my 5900X would bottleneck my 4090 at 50% while facing the train and the fire pit with the old lady and the young girl in Metro Exodus: Enhanced Edition, but other than that I never had any huge bottlenecks. It's the exception.
I mean, that's fine, enjoy your 75% GPU usage on your 4090s... I'm going to go play Stalker 2 now at 15% higher FPS and 25% better lows and mins at 4K with DLSS. :toast:


The argument that X game isn't optimized just proves my point... almost none of these games are optimized on release. Other than Dragon Age: The Veilguard and DOOM Eternal, everything has all sorts of jank. People's experience of a game is based on lows and stutters -- it doesn't matter if 90% of the time you're getting almost the same FPS -- that 10% where it dips is what you will notice.
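
That's exactly what the "1% low" metric captures: average the slowest 1% of frames and the dips dominate the number, even if the overall average barely moves. A small sketch of the computation from a frame-time log (the file name is hypothetical, e.g. something exported from CapFrameX; exact "1% low" definitions vary between tools):

```c
/* Sketch: average FPS vs "1% low" from a frame-time log in milliseconds.
 * frametimes_ms.txt is a hypothetical log file, one frame time per line. */
#include <stdio.h>
#include <stdlib.h>

static int cmp_desc(const void *a, const void *b) {
    double d = *(const double *)b - *(const double *)a;
    return (d > 0) - (d < 0);
}

int main(void) {
    static double ft[100000];
    double sum = 0.0;
    size_t n = 0;
    FILE *f = fopen("frametimes_ms.txt", "r");   /* hypothetical log */
    if (!f) return 1;
    while (n < 100000 && fscanf(f, "%lf", &ft[n]) == 1) sum += ft[n++];
    fclose(f);
    if (n == 0) return 1;
    qsort(ft, n, sizeof ft[0], cmp_desc);        /* slowest frames first */
    size_t k = n / 100 ? n / 100 : 1;            /* worst 1% of frames */
    double worst = 0.0;
    for (size_t i = 0; i < k; i++) worst += ft[i];
    printf("avg FPS: %.1f   1%% low: %.1f\n",
           1000.0 * (double)n / sum, 1000.0 * (double)k / worst);
    return 0;
}
```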
 
Joined
Oct 19, 2022
Messages
114 (0.14/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D (+PBO)
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Samsung 990 PRO 2TB w/ Heatsink SSD + Seagate FireCuda 530 SSD 2TB w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz monitor (+ LG OLED C9 55" TV 4K@120Hz)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q with AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
It's additive, so there are actually 4 games for you. Point is, life is short - play games. If there is a game that's struggling to feed your GPU, then upgrade your CPU. Simple stuff -- anything below 80% GPU usage for me is struggling, not feeding the GPU enough, so I upgrade.


I mean, that's fine, enjoy your 75% GPU usage on your 4090s... I'm going to go play Stalker 2 now at 15% higher FPS and 25% better lows and mins at 4K with DLSS. :toast:


The argument that X game isn't optimized just proves my point... almost none of these games are optimized on release. Other than Dragon Age: The Veilguard and DOOM Eternal, everything has all sorts of jank. People's experience of a game is based on lows and stutters -- it doesn't matter if 90% of the time you're getting almost the same FPS -- that 10% where it dips is what you will notice.
Most games are poorly optimized nowadays, which is why I usually wait 1 or 2 months to play them. But some studios like Nixxes launch games that are fully playable from the get-go (even if they can have a few bugs here and there, nothing bad).
The problem with newer games (mostly UE5 ones) is that we experience stutters and/or huge framerate drops that can definitely kill the immersion and fun.

4K DLSS is cool, but 4K DLAA looks much sharper/cleaner! That's where high-end GPUs matter ;)
 