
Star Wars Jedi: Survivor Benchmark Test & Performance Analysis

Joined
Jul 11, 2015
Messages
804 (0.23/day)
System Name Harm's Rig's
Processor 5950X /2700x / AMD 8370e 4500
Motherboard ASUS DARK HERO / ASRock B550 Phantom Gaming 4
Cooling Arctic Liquid Freezer III 420 Push/Pull -6 Noctua NF-A14 i and 6 Noctua NF-A14 i Meshify 2 XL
Memory CORSAIR Vengeance RGB RT 32GB (4x16GB) DDR4 4266cl16 - Patriot Viper Steel DDR4 16GB (4x 8GB)
Video Card(s) ZOTAC AMP EXTREME AIRO 4090 / 1080 Ti /290X CFX
Storage SAMSUNG 980 PRO SSD 1TB/ WD DARK 770 2TB , Sabrent NVMe 512GB / 1 SSD 250GB / 1 HHD 3 TB
Display(s) Thermal Grizzly WireView / TCL 646 55 TV / 50 Xfinity Hisense A6 XUMO TV
Case Meshify 2 XL- TT 37 VIEW 200MM'S-ARTIC P14MAX
Audio Device(s) Sharp Aquos
Power Supply FSP Hydro PTM PRO 1200W ATX 3.0 PCI-E GEN-5 80 Plus Platinum - EVGA 1300G2/Corsair w750
Mouse G502
Keyboard G413
Vsync+VRR is best

If you can't stay close to your refresh rate, you may need Vsync off to trade stuttering for tearing.
Vsync+VRR is best for sure. 48/120 on my system with a 4090, max settings raw, 60 to 80 FPS gameplay.
 
Joined
Jul 11, 2022
Messages
375 (0.42/day)
Bottom line is the 3D chips spank the 13900K in gaming and use far less wattage, and thus dump much less heat into the case to interfere with precious RTX 4090 cooling.

And they are more reliable too.

I had nothing but random, unexpected WHEA CPU Internal errors during shader compilation in The Last of Us Part I, even with a lower-clocked 13900K with the e-waste cores disabled, despite it easily passing every CPU and RAM/IMC stress test, including OCCT and Y-Cruncher, with flying colors. And it is so hit or miss. I had this issue on multiple 13900Ks with e-waste cores disabled that were perfectly stable at 5.6 GHz with a 5 GHz ring, and then boom, an unexpected WHEA a couple of weeks down the road despite validating stability with lots of tough tests. I even tried another 13900KF clocked lower, to a 4.8 GHz ring and 5.4 GHz all-core with e-cores disabled, and temps were much lower, so I thought finally no more WHEA errors would come, as maybe the stability issues were random at the higher speeds. But nope, 3 weeks later, despite roughly 10°C lower temps and lower power, the WHEA Internal error was back during the LOU shader compilation while gaming, which made me give up on Intel for good and go with the 7800X3D. The cooling for Intel needs to be much better, and the Raptor Lake chips are either inconsistent or degrade easily even when not worked that hard at only 160 watts. Certainly without extremely good liquid cooling, anyway.

I gave up on it and sold it and now use a 7800X3D and it is so much better and more reliable.

Yes, it runs hotter relative to its watt consumption, but the overall system is much cooler, as so much less heat is dumped into the case due to the lower power draw.

And the Intel Raptor Lake CPUs still ran just as hot or hotter, as their power consumption is so much higher, despite being easier to cool relative to watt consumption. They are still a step back from Alder Lake due to the same 10nm die size, with Intel having to make the P-cores and e-cores smaller to squeeze more e-waste core clusters onto the same 10nm die size on RL vs AL, thus cooling per watt is worse on Raptor than Alder.

The 7800X3D is so much better for gaming anyway, without having to tweak; tweaking put so much stress on me and I was sick of it. So glad it is over with, as I only game anyway and it is the best for gaming. I'm not at all concerned anymore that it runs hotter at lower wattage and cannot come close in productivity. And no games use more than 8 cores. Plus you get a platform upgrade path, unlike on Intel.
 
Joined
Dec 12, 2012
Messages
777 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
And the Intel Raptor Lake CPUs still ran just as hot or hotter, as their power consumption is so much higher, despite being easier to cool relative to watt consumption. They are still a step back from Alder Lake due to the same 10nm die size, with Intel having to make the P-cores and e-cores smaller to squeeze more e-waste core clusters onto the same 10nm die size on RL vs AL, thus cooling per watt is worse on Raptor than Alder.

Your rant was interesting, but where did you get this idea? Full Raptor Lake die is 257 mm2, full Alder Lake die is 208 mm2.

You can't squeeze more cores into the same die area on the same process; the die has to be bigger. And at the same clocks, Raptor is a bit more efficient than Alder.


Anyway, obviously the 7800X3D is faster and more efficient; without the extra cache, and being on an older node, there's no getting around that.
My friend bought it and we're building it tomorrow. It'll be cool to test an AMD platform for the first time in over a decade.
 
Joined
Jul 11, 2022
Messages
375 (0.42/day)
Your rant was interesting, but where did you get this idea? Full Raptor Lake die is 257 mm2, full Alder Lake die is 208 mm2.

You can't squeeze more cores into the same die area on the same process; the die has to be bigger. And at the same clocks, Raptor is a bit more efficient than Alder.


Anyway, obviously the 7800X3D is faster and more efficient; without the extra cache, and being on an older node, there's no getting around that.
My friend bought it and we're building it tomorrow. It'll be cool to test an AMD platform for the first time in over a decade.


Well, I read they are both on the 10nm node, so I thought they were the same size.

But I had nothing but random stability problems with Raptor Lake (CPU WHEA errors), despite Intel's reputation for being more stable than AMD. Yet the 7800X3D on the newer, less mature AM5 platform has been great for me so far.

I also was able to cool Alder Lake better at similar power consumption than Raptor Lake, though maybe it was just placebo?

Maybe both architectures struggle to be consistently stable much above 5 GHz all-P-core.

I thought Raptor Lake would be rock solid, as it was supposedly a refinement of Alder Lake, more mature and running faster clocks, but I have been very disappointed, as that was nowhere near the case. A random CPU-related WHEA or two would show up down the road when running Cinebench or game shader compilation, despite my fixed ring and all-P-core clock speeds at a static VCORE passing OCCT, Y-Cruncher, Prime95, LinPack XTREME, and RealBench across multiple 13900K chips.

I never had those issues with a 5 GHz all-core 12700K Alder Lake at a 4.8 GHz ring. Yet the 13900KF at 5.4 GHz with a 4.8 GHz ring at 1.225 V LLC6 passed even more tests, and I even did the shader compilation to make sure, and it passed with flying colors, with 10°C lower temps than my prior higher-VCORE, higher-clocked 13900Ks. Then I used the system normally for 3-4 weeks without any tough loads, and CPU power consumption never exceeded 90 watts. Then I decided I was going to start gaming, thinking I finally had a stable, lower-clocked 13900K(F), and after running the Last of Us shader compilation, my heart broke when I saw a CPU Internal WHEA in HWiNFO64, and I screamed WTF!! I had been teased with ultimate stability after passing even more tests. So I ran away and gave up on Raptor Lake altogether.

Tweaking it is so hard, and the 7800X3D is much better for running at stock.

I liked Intel at first due to similar performance in gaming, and I liked the ability to do static clocks. But that went out the window, even despite 1.225 V at LLC6; a slight undervolt should easily have been enough for an underclocked 5.4 GHz with a 4.8 GHz ring and the e-waste cores off. I mean, having to run a lower ring clock than Alder Lake? I was like WTF. And even then I wasn't sure. So enough with desiring static clocks (my own old-school way of thinking); I just went with the 7800X3D and am much easier and happier now.
 
Joined
Jun 14, 2020
Messages
3,536 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Bottom line is the 3D chips spank the 13900K in gaming and use far less wattage, and thus dump much less heat into the case to interfere with precious RTX 4090 cooling.

And they are more reliable too.

I had nothing but random, unexpected WHEA CPU Internal errors during shader compilation in The Last of Us Part I, even with a lower-clocked 13900K with the e-waste cores disabled, despite it easily passing every CPU and RAM/IMC stress test, including OCCT and Y-Cruncher, with flying colors. And it is so hit or miss. I had this issue on multiple 13900Ks with e-waste cores disabled that were perfectly stable at 5.6 GHz with a 5 GHz ring, and then boom, an unexpected WHEA a couple of weeks down the road despite validating stability with lots of tough tests. I even tried another 13900KF clocked lower, to a 4.8 GHz ring and 5.4 GHz all-core with e-cores disabled, and temps were much lower, so I thought finally no more WHEA errors would come, as maybe the stability issues were random at the higher speeds. But nope, 3 weeks later, despite roughly 10°C lower temps and lower power, the WHEA Internal error was back during the LOU shader compilation while gaming, which made me give up on Intel for good and go with the 7800X3D. The cooling for Intel needs to be much better, and the Raptor Lake chips are either inconsistent or degrade easily even when not worked that hard at only 160 watts. Certainly without extremely good liquid cooling, anyway.

I gave up on it and sold it and now use a 7800X3D and it is so much better and more reliable.

Yes, it runs hotter relative to its watt consumption, but the overall system is much cooler, as so much less heat is dumped into the case due to the lower power draw.

And the Intel Raptor Lake CPUs still ran just as hot or hotter, as their power consumption is so much higher, despite being easier to cool relative to watt consumption. They are still a step back from Alder Lake due to the same 10nm die size, with Intel having to make the P-cores and e-cores smaller to squeeze more e-waste core clusters onto the same 10nm die size on RL vs AL, thus cooling per watt is worse on Raptor than Alder.

The 7800X3D is so much better for gaming anyway, without having to tweak; tweaking put so much stress on me and I was sick of it. So glad it is over with, as I only game anyway and it is the best for gaming. I'm not at all concerned anymore that it runs hotter at lower wattage and cannot come close in productivity. And no games use more than 8 cores. Plus you get a platform upgrade path, unlike on Intel.
Pebkac
 
Joined
Jul 11, 2022
Messages
375 (0.42/day)
Edit delete.

Well, anyway, Raptor Lake is much harder to overclock/underclock/manually tune on the CPU cores and ring frequency than Comet Lake and prior gens.

I have seen from experience, from Sandy Bridge through Comet Lake, that with manual overclock/underclock/voltage tuning, passing stress tests like OCCT, Prime95, Y-Cruncher, LinPack XTREME and such guaranteed full real-world stability in just about every case.

Sadly, that is not the case with multiple Raptor Lake chips that passed all of those with flying colors, then down the road threw a WHEA or BSOD in Cinebench, or the same type of thing during game shader compilation. And that is even if it passed with no issues at first; a week or a few later, boom, a random WHEA. And it's not always consistent either. And a WHEA is bad, as it means a BSOD could have resulted, so the chip is not stable even if it did not crash that time.

So I am done with Raptor Lake, despite thinking it should have been as easy to manually clock-tune as Coffee Lake and before.
 
Joined
Jan 14, 2019
Messages
12,586 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
It might be, which is why at stock I would rather have the 7800X3D. Intel chips were only worth it to me to tune manually, not to run at stock. I never had any interest in stock, as that means e-cores, and I hate those. Plus I like static clocks, and if I was going to give up on those, I might as well go 7800X3D, since it has no tuning ability anyway and is the best gaming chip out there overall.
Maybe by disabling the e-cores, the CPU allocates all of its power allowance to the p-cores, and that's what gives you the errors? Or maybe your tune wasn't too good?

If you have no interest at stock settings, then the 7800X3D isn't your thing. It's pretty much a locked CPU - that's where its beauty lies. :)
 
Joined
Jun 14, 2020
Messages
3,536 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Edit delete.

Well, anyway, Raptor Lake is much harder to overclock/underclock/manually tune on the CPU cores and ring frequency than Comet Lake and prior gens.

I have seen from experience, from Sandy Bridge through Comet Lake, that with manual overclock/underclock/voltage tuning, passing stress tests like OCCT, Prime95, Y-Cruncher, LinPack XTREME and such guaranteed full real-world stability in just about every case.

Sadly, that is not the case with multiple Raptor Lake chips that passed all of those with flying colors, then down the road threw a WHEA or BSOD in Cinebench, or the same type of thing during game shader compilation. And that is even if it passed with no issues at first; a week or a few later, boom, a random WHEA. And it's not always consistent either. And a WHEA is bad, as it means a BSOD could have resulted, so the chip is not stable even if it did not crash that time.

So I am done with Raptor Lake, despite thinking it should have been as easy to manually clock-tune as Coffee Lake and before.
Man, if you can't pass TLOU shader compilation then you were never stable to begin with. Have you like...tried doing it stock?
 
Joined
Jul 11, 2022
Messages
375 (0.42/day)
Maybe by disabling the e-cores, the CPU allocates all of its power allowance to the p-cores, and that's what gives you the errors? Or maybe your tune wasn't too good?

If you have no interest at stock settings, then the 7800X3D isn't your thing. It's pretty much a locked CPU - that's where its beauty lies. :)

Well, I'm not sure about your e-core comment, but maybe it's true, which would mean the days of buying Intel CPUs and using only the P-cores are over, as somehow the P-cores got dependent on the e-waste cores, which is a horrible design and a shame if true. I hate the e-cores and they suck. With Alder Lake it was easy to disable the e-cores and use only the P-cores with no issues.

Maybe the design has changed in a way that somehow ruined that, though it's hard to believe. But rumors and rumblings have flown around at overclock.net from Falkentyne that the design was changed in such a way that turning off the e-cores can cause weird single-thread performance issues, as they were somehow needed to allow the ring to run at full speed?? Though that was only about performance; I'm not sure why it would cause WHEA errors, but who knows. Just speculation based on the random WHEAs I have experienced on RL chips.

And my tuning should have been fine based on the stress/stability tests I ran, which passed easily multiple times, if only the same methodology that applied to Coffee Lake and prior applied to current Intel chips, which it appears it does not.

Yes at one time I did not like stock settings.

Now I do have interest in stock settings, as overclocking and manual tuning are so difficult with today's generation of CPUs, unlike how easy it was with Coffee Lake and before.

So the 7800X3D is now for me: pretty much all stock except a -20 CO, which tames temps a little, and DDR5-6000 XMP RAM with Buildzoid-tuned timings with more conservative numbers punched in, which reduced memory latency from 69 ns to 62 ns. No static clocks, though, and I am now perfectly happy and OK with that.

Man, if you can't pass TLOU shader compilation then you were never stable to begin with. Have you like...tried doing it stock?


I actually passed it twice on a test at first.

Since I am OCD and like to do stability testing, confirm it, and then do a fresh Windows install once confirmed free of any BSODs or WHEAs I may have gotten during my tuning, I did exactly that, and then used the PC normally for 3-4 weeks, as I had no time for gaming during that period.

Then I had some time for gaming and decided to start playing TLOU, and there was no crash during shader compilation. Just to double check and make sure I was truly stable, despite it being stable 4 weeks earlier with no errors, I wanted to make sure there were no WHEAs in HWiNFO64. So I exited the game, and my face blew up when I saw one or two CPU Internal errors!!

I was able to pass all Y-Cruncher tests multiple times, LinPack XTREME a bunch of times, OCCT Large Data Set variable a few times, and Prime95 including Small FFTs, all with flying colors. And even TLOU at first, with flying colors.

Then a few weeks later, boom, a WHEA during TLOU shader compilation, which got me very upset.

With the prior 13900Ks I had clocked at 5.6 to 5.7 GHz with a 5 GHz ring, I omitted Prime95 Small FFTs with AVX on and Y-Cruncher SFT, as those tests torture the system so much that there is no way it would not have thermally throttled. I thought, well, those are too tough and not real-world anyway. So maybe you were right there.

But the 5.4 GHz config chip at only a 4.8 GHz ring had so much lower power usage that I ran those tests, and they passed with flying colors, with only a peak temp of 91°C and 210 watts of power in those worst-case, power-virus-like tests. So I should have been fully stable at 5.4 GHz, since I did not omit even the toughest, most insane tests.

And I even ran the TLOU shader compilation at first to make sure, and it passed twice with no issues.

Until a few weeks later, when I actually wanted to game and needed to do it again: boom, a WHEA despite a peak temp of 83°C on the lower-clocked chip, which made me give up.

So the reality is these chips are much harder to get stable, and stock is the way to go. But in that case, since I hate the e-cores and wanted lower power, I sold it off and got the 7800X3D before even trying stock, as the only reason I got Intel was that I thought tuning manually would be easy like Coffee Lake. It turns out that is not the case, as overclocking/underclocking/undervolting/manual tuning can cause CPU instability in ways that even many tough stress/stability tests seem not to catch, unlike Coffee Lake and before, where it was easy.
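For anyone who wants to automate that routine of passing the stress tests and only later spotting WHEA errors in HWiNFO64, a small script can do the before/after comparison for you. This is only a minimal sketch, assuming Python on Windows and an elevated prompt; the stress-tool path is a hypothetical placeholder, not a specific recommendation. It counts Microsoft-Windows-WHEA-Logger entries in the System event log before and after a stress pass:

import subprocess

WHEA_QUERY = "*[System[Provider[@Name='Microsoft-Windows-WHEA-Logger']]]"

def whea_event_count() -> int:
    # Count WHEA-Logger entries in the System log via wevtutil (text output).
    result = subprocess.run(
        ["wevtutil", "qe", "System", f"/q:{WHEA_QUERY}", "/f:text", "/c:500"],
        capture_output=True, text=True,
    )
    return result.stdout.count("Event[")

def run_stress_pass() -> None:
    # Placeholder for whatever stress tool you prefer (OCCT, Prime95, Y-Cruncher...);
    # the path below is hypothetical.
    subprocess.run([r"C:\tools\stress\stress_tool.exe"], check=False)

if __name__ == "__main__":
    before = whea_event_count()
    run_stress_pass()
    after = whea_event_count()
    print(f"WHEA events before: {before}, after: {after}")
    if after > before:
        print("New WHEA event(s) were logged during the run - treat the tune as unstable.")

Running the same check right after a gaming session or a shader-compilation run would catch the kind of error that otherwise only shows up weeks later.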
 
Joined
Jan 14, 2019
Messages
12,586 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Well, I'm not sure about your e-core comment, but maybe it's true, which would mean the days of buying Intel CPUs and using only the P-cores are over, as somehow the P-cores got dependent on the e-waste cores, which is a horrible design and a shame if true. I hate the e-cores and they suck. With Alder Lake it was easy to disable the e-cores and use only the P-cores with no issues.

Maybe the design has changed in a way that somehow ruined that, though it's hard to believe. But rumors and rumblings have flown around at overclock.net from Falkentyne that the design was changed in such a way that turning off the e-cores can cause weird single-thread performance issues, as they were somehow needed to allow the ring to run at full speed?? Though that was only about performance; I'm not sure why it would cause WHEA errors, but who knows. Just speculation based on the random WHEAs I have experienced on RL chips.

And my tuning should have been fine based on the stress/stability tests I ran, which passed easily multiple times, if only the same methodology that applied to Coffee Lake and prior applied to current Intel chips, which it appears it does not.

Yes at one time I did not like stock settings.

Now I do have interest in stock settings, as overclocking and manual tuning are so difficult with today's generation of CPUs, unlike how easy it was with Coffee Lake and before.

So the 7800X3D is now for me: pretty much all stock except a -20 CO, which tames temps a little, and DDR5-6000 XMP RAM with Buildzoid-tuned timings with more conservative numbers punched in, which reduced memory latency from 69 ns to 62 ns. No static clocks, though, and I am now perfectly happy and OK with that.
I see. Now the only part I don't understand is why you bought a 13900K if you're so much against e-cores. A 12500 would have been totally fine (albeit with two fewer P-cores).
 
Joined
Jun 14, 2020
Messages
3,536 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
This is a stock 12900K running TLOU. Disabling e-cores is dumb dumb, seeing how it's heavily using all 16 of the P+E cores. Same FPS as your "upgraded" 7800X3D gets, right? :D
 
Joined
Jul 11, 2022
Messages
375 (0.42/day)
I see. Now the only part I don't understand is why you bought a 13900K if you're so much against e-cores. A 12500 would have been totally fine (albeit with two fewer P-cores).


Well, I wanted the best-binned P-cores. I had gone with a 13700K to try even lower clocks, but it sucked, as it could not do a fully stable 4.8 GHz ring whereas the 12700K could, and that was unacceptable. And its IMC sucked as well.

The higher-binned ones have better IMCs. The 13900KF had a very good DDR5 IMC, as it was one of the few that got DDR5-6800 fully stable, unlike the random Asus-board DDR5 XMP instability WHEAs I had.

But once again the non-memory-related CPU Internal WHEA came back. And I was told at overclock.net that CPU Internal errors are never memory or IMC related and are always CPU core, voltage, or ring related.

And yes, I wanted 8 P-cores, so the 12500 was out, and unfortunately Intel has no options with 8 P-cores and no e-waste cores. While most games only use 6 cores and almost none use more than 8, 8 is better in the long run, though there is no need for more.

This is a stock 12900K running TLOU. Disabling e-cores is dumb dumb, seeing how it's heavily using all 16 of the P+E cores. Same FPS as your "upgraded" 7800X3D gets, right? :D


No, it does not. Not when I have tested it. It uses a maximum of 16 threads; 8 cores / 16 threads at most. Run a later patch of the game.

I have seen countless examples of how it cannot even saturate a 7950X to 50%, which means it does not need more than 8 cores during gameplay.

Shader compilation is temporary and uses all the cores it can get, but once it is done there is no need for more cores. And even then, it just takes a bit longer.

@Dragam1337

I agree with you that the e-cores are terrible, and I call them e-waste cores as well. They are only used because Intel cannot fit more than 8 good cores on their die, so they resort to gimmicks to keep up with AMD in heavily multi-threaded workloads. AMD does not need to do that; they just need better single-thread gaming performance overall, though they solved that with the 3D chips, which I see you also have, which is great.
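For what it's worth, anyone can check how many cores a game actually keeps busy on their own rig instead of arguing over screenshots. A minimal sketch, assuming Python with the third-party psutil package installed, that samples per-core load while the game is running:

import psutil

SAMPLES = 30        # how many samples to take
INTERVAL = 2.0      # seconds per sample

peak_busy = 0
for _ in range(SAMPLES):
    # Per-logical-core utilisation over the sampling interval.
    per_core = psutil.cpu_percent(interval=INTERVAL, percpu=True)
    busy = sum(1 for load in per_core if load > 50.0)
    peak_busy = max(peak_busy, busy)
    print(f"{busy}/{len(per_core)} logical cores above 50% load")

print(f"Peak logical cores above 50% load: {peak_busy}")

The 50% threshold is arbitrary; the point is simply to see whether the game spreads meaningful work beyond 8 cores / 16 threads once shader compilation is done.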
 
Joined
Jun 14, 2020
Messages
3,536 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
No, it does not. Not when I have tested it. It uses a maximum of 16 threads; 8 cores / 16 threads at most. Run a later patch of the game.

I have seen countless examples of how it cannot even saturate a 7950X to 50%, which means it does not need more than 8 cores during gameplay.

Shader compilation is temporary and uses all the cores it can get, but once it is done there is no need for more cores. And even then, it just takes a bit longer.

@Dragam1337

I agree with you that the e-cores are terrible, and I call them e-waste cores as well. They are only used because Intel cannot fit more than 8 good cores on their die, so they resort to gimmicks to keep up with AMD in heavily multi-threaded workloads. AMD does not need to do that; they just need better single-thread gaming performance overall, though they solved that with the 3D chips, which I see you also have, which is great.
Oh really?

 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.91/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Bottom line is the 3D chips spank the 13900K in gaming and use far less wattage, and thus dump much less heat into the case to interfere with precious RTX 4090 cooling.

And they are more reliable too.

I had nothing but random, unexpected WHEA CPU Internal errors during shader compilation in The Last of Us Part I, even with a lower-clocked 13900K with the e-waste cores disabled, despite it easily passing every CPU and RAM/IMC stress test, including OCCT and Y-Cruncher, with flying colors. And it is so hit or miss. I had this issue on multiple 13900Ks with e-waste cores disabled that were perfectly stable at 5.6 GHz with a 5 GHz ring, and then boom, an unexpected WHEA a couple of weeks down the road despite validating stability with lots of tough tests. I even tried another 13900KF clocked lower, to a 4.8 GHz ring and 5.4 GHz all-core with e-cores disabled, and temps were much lower, so I thought finally no more WHEA errors would come, as maybe the stability issues were random at the higher speeds. But nope, 3 weeks later, despite roughly 10°C lower temps and lower power, the WHEA Internal error was back during the LOU shader compilation while gaming, which made me give up on Intel for good and go with the 7800X3D. The cooling for Intel needs to be much better, and the Raptor Lake chips are either inconsistent or degrade easily even when not worked that hard at only 160 watts. Certainly without extremely good liquid cooling, anyway.

I gave up on it and sold it and now use a 7800X3D and it is so much better and more reliable.

Yes, it runs hotter relative to its watt consumption, but the overall system is much cooler, as so much less heat is dumped into the case due to the lower power draw.

And the Intel Raptor Lake CPUs still ran just as hot or hotter, as their power consumption is so much higher, despite being easier to cool relative to watt consumption. They are still a step back from Alder Lake due to the same 10nm die size, with Intel having to make the P-cores and e-cores smaller to squeeze more e-waste core clusters onto the same 10nm die size on RL vs AL, thus cooling per watt is worse on Raptor than Alder.

The 7800X3D is so much better for gaming anyway, without having to tweak; tweaking put so much stress on me and I was sick of it. So glad it is over with, as I only game anyway and it is the best for gaming. I'm not at all concerned anymore that it runs hotter at lower wattage and cannot come close in productivity. And no games use more than 8 cores. Plus you get a platform upgrade path, unlike on Intel.
WHEA errors on Ryzen are usually related to the SoC/Infinity Fabric and leaving idle states. On Intel they could be related to PCIe corruption or memory controller voltages.

The more parts run directly off the CPU, the more things can fail from a single thing the CPU doesn't like (like the higher power draw from modern, insanely fast RAM).
 
Joined
Jul 11, 2022
Messages
375 (0.42/day)
WHEA errors on Ryzen are usually related to the SoC/Infinity Fabric and leaving idle states. On Intel they could be related to PCIe corruption or memory controller voltages.

The more parts run directly off the CPU, the more things can fail from a single thing the CPU doesn't like (like the higher power draw from modern, insanely fast RAM).


I have had zero on my 7800X3D. With multiple 13900Ks, on both multiple DDR4 setups and one DDR5 setup at XMP, the RAM was rock stable with no issues. But the CPU WHEA errors, which have nothing to do with RAM per Ichioru at overclock.net and other reading, made me think degradation, inconsistency, or bad quality control on these Intel chips. The fact that I passed all the stress/stability tests multiple times with multiple 13900K chips, then down the road got a random/intermittent BSOD or WHEA CPU Internal error after a Cinebench run or shader compilation, made me throw in the towel on Intel.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.91/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
I have had zero on my 7800X3D. With multiple 13900Ks, on both multiple DDR4 setups and one DDR5 setup at XMP, the RAM was rock stable with no issues. But the CPU WHEA errors, which have nothing to do with RAM per Ichioru at overclock.net and other reading, made me think degradation, inconsistency, or bad quality control on these Intel chips. The fact that I passed all the stress/stability tests multiple times with multiple 13900K chips, then down the road got a random/intermittent BSOD or WHEA CPU Internal error after a Cinebench run or shader compilation, made me throw in the towel on Intel.
I can only speak for AM4 and AM5 on these errors, but people make statements like that and the meaning changes completely without context or deeper understanding of the subject.
On Ryzen, higher memory speeds raise the Infinity Fabric clock, and the higher Infinity Fabric clock causes the WHEA errors.

It's true that it's not a RAM error, but it's also true that high RAM speed causes the WHEA error.
Tiny changes to wording make it sound like those can't both be true.


There's a thread on WD drives causing WHEA errors and how to read the error logs to identify the device causing the problem.
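For reference, that kind of log reading can also be done from a script instead of clicking through Event Viewer. A minimal sketch, assuming Python on Windows: it just dumps the most recent WHEA-Logger entries, whose details typically name the error source and reporting component, which is what points at the offending device or bus.

import subprocess

query = "*[System[Provider[@Name='Microsoft-Windows-WHEA-Logger']]]"
# Pull the ten most recent WHEA events from the System log, newest first.
result = subprocess.run(
    ["wevtutil", "qe", "System", f"/q:{query}", "/f:text", "/c:10", "/rd:true"],
    capture_output=True, text=True,
)
print(result.stdout or "No WHEA-Logger events found.")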
 