
I switched from LGA1700 to AM5, here are my thoughts

Joined
Jul 29, 2023
Messages
46 (0.12/day)
System Name Shirakami
Processor 7800X3D 80C PBO Level 5 @ -35 / 1.17 SoC / 2100 IF clock / 1:1 memory
Motherboard Gigabyte B650i AORUS Ultra
Cooling Corsair iCUE H100i ELITE CAPELLIX w/ ultra slim 2x120mm fans
Memory 2x24GB Hynix M-Die @ 6000 (28-36-28-30-28) - 65k tREFI 1.45 VDD / 1.20 VDDQ
Video Card(s) 6900 XT Reference
Storage 1 & 2 TB NVMe - 1 TB SATA SSD
Display(s) LG 34GN850 (3440x1440) 160 Hz overclock
Case Lian Li Q58
VR HMD Reverb G2 V2
Software Fedora Linux 40 operating system (daily) - Win10 (Virtual Reality stuff)
Earlier this week, I made the jump from my 13700k to a 7800X3D. Oddly enough, I didn't do this because of degradation or the current Intel debacle. Rather, I switched because of power draw and thread scheduling. My system was completely stable in the one year I owned it. My specifications were as follows:

13700k - ASUS Z690 ITX - DDR5 7200 2x24GB - 6900 XT

My new specifications are

7800X3D - Gigabyte B650i AORUS ULTRA - 7200 2x24GB @ 6000 - 6900 XT

My Intel system was fairly tuned (~10% more performance, mostly from the memory and ring) and the 7800X3D is as tuned as you can make it (~7% more performance).

To begin with, I loved my 13700k. I found it to be a wonderful CPU and a nice upgrade from a 10900k. Unfortunately, the power draw was quite immense in modern games - Cyberpunk 2077 averaged around 140 watts with spikes upwards of 165 on occasion (HWiNFO64 CPU package power). I tried mitigating this by disabling Hyper-Threading, but the power draw only lowered to about 125 with spikes up to 150. I also tried disabling the efficiency cores (HT still enabled); the power draw lowered to around 120 with spikes up to 140. Disabling both HT and the e-cores resulted in the most substantial reduction in power, going all the way down to 115 watts with no real spikes to speak of. The performance difference between all three setups was less than 10%, which is one of the main factors that pushed me towards changing platforms. If it's pulling upwards of 140 watts in current games, I can't even imagine what the CPU will draw five years from now... and electricity isn't getting any cheaper.
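To put that power-draw gap in money terms, here's a quick back-of-the-envelope sketch; the gaming hours and the electricity price are assumptions for illustration, not figures from the post:

```python
# Rough yearly cost of a CPU power-draw delta during gaming.
# Assumed: 3 hours of gaming per day, 0.30 EUR per kWh (local rates vary).
GAMING_HOURS_PER_DAY = 3
PRICE_PER_KWH = 0.30  # EUR, assumed

def annual_cost(delta_watts: float) -> float:
    """Yearly cost of drawing an extra `delta_watts` during gaming hours."""
    kwh_per_year = delta_watts / 1000 * GAMING_HOURS_PER_DAY * 365
    return kwh_per_year * PRICE_PER_KWH

# ~140 W (13700k in Cyberpunk) vs ~40 W (7800X3D, mentioned later on):
print(f"{annual_cost(140 - 40):.2f} EUR/year")
```

At these assumptions the 100 W delta works out to roughly 33 EUR a year; scale the constants to your own habits and tariff.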

Disabling HT feels bad, even if it's not a huge driver of gaming performance. Disabling the e-cores feels even worse, considering those are tangible cores, unlike HT, which merely pushes more instructions through the same physical cores each clock cycle. There's also the argument that disabling either of those features will hurt gaming performance in the future - leaving you with only one option: underclocking. You can set a flat 5 GHz on the p-cores and 3.9 GHz on the e-cores at a fairly reasonable voltage; power consumption gets cut by roughly 25% in gaming scenarios... but you also completely lose out on Raptor Lake being a monster when it comes to clock speed. It feels just as bad as the above concessions. There is also the issue of thread scheduling: I do not run Windows 11. I daily-drive Linux, which didn't get proper support for Raptor Lake until kernel 6.6 in October 2023 - and I use the term "support" loosely, because in gaming scenarios the e-cores still get loaded with instructions. Windows 10 was and is my go-to whenever I need to dual boot; however, the thread scheduler in Windows 10 isn't anywhere near as good as the one in Windows 11. It has a similar issue to Linux wherein the efficiency cores get loaded with instructions, causing noticeable microstutters in games.

Add to all of the issues I've listed above the poor memory controller found on 13th and 14th generation - with stability drifting on and off when aiming for speeds beyond 7200 MT/s - alongside the IHS bending, requiring a third-party loading mechanism (which I purchased), not to mention the ongoing degradation from the SVID requesting upwards of 1.65 V...

Needless to say, after over a year of ownership, I was fairly sick of it all.

The power draw was a problem I needed to solve. I undervolted and overclocked the system and set a strict power limit of 175 W to ensure AVX2 workloads wouldn't cause crashing. It only needed to be stable for games, not heavy workloads, so the voltage I aimed for was whatever could run Cyberpunk 2077 while CPU-bound.

The scheduling was an issue I needed to solve, so I purchased Process Lasso for when I use W10 - which I highly recommend; it's a wonderful application.

The bending was an issue I needed to solve, so I purchased the ThermalTake loading mechanism.

The lack of memory stability at XMP speeds (7200 MT/s) was an issue I needed to solve, which took weeks of tuning PLL voltages, VDD and VDDQ, IMC voltages, SA voltage, and so on.

The heat itself was an issue I needed to solve; I was highly uncomfortable with my CPU pulling 140 watts in games only to thermal throttle during load screens once the radiator, IHS, and liquid were all saturated (240 mm AIO).

The only saving grace of my build was the ASUS Z690 Gaming WiFi ITX motherboard. I absolutely adored that board; it was overengineered, and the VRMs never rose past 65 C in stress tests. That, and the single-threaded performance - Raptor Lake has absolutely insane ST performance. My 7800X3D achieves 655 points in CPU-Z single-thread, while the 13700k is upwards of 900.

Admittedly, as of writing this, I haven't been on AM5 long enough to have experienced SoC voltages murdering AM5 CPUs outright. I also wasn't around when memory training took upwards of 5 minutes, and I wasn't here when core parking was a problem either. However, in the here and now, I'm absolutely in love with the ease of use of my 7800X3D. It performs about the same as, and sometimes better than, my 13700k did in gaming situations... while drawing 40 watts. If I were recommending a system to a friend today, I couldn't in good conscience recommend LGA1700 - my experience with it was quite negative, even if the CPU itself was blisteringly fast.

I hope that whatever Intel releases for their next generation is more refined. Competition is a very, very good thing - without it, we wouldn't even have an alternative like the 7800X3D to choose from. I love both teams. I even purchased a 1700X the day it came out and dealt with the fiasco of very buggy X370 BIOSes. I loved my 10900k as well; it was a wonderful CPU, but it was made obsolete when 12th generation released, considering the e-cores were almost on par with it in ST performance. The 3570k I had before that was rock solid - I never had any issues with it whatsoever. But this entire LGA1700 platform has just been a mess for me personally; in retrospect, I wish I had purchased a 12900k in place of the 13700k. At least on Alder Lake, disabling the efficiency cores granted you a boost to cache speed - so it wasn't a complete loss.

But as things stand, I'll be holding on to this chip for a while. I should note that the 7800X3D did not exist at the time I purchased my 13700k.
 
Joined
Feb 10, 2007
Messages
2,630 (0.41/day)
Location
Oulu, Finland
System Name Enslaver :)
Processor Ryzen 7 7800X3D
Motherboard ASUS TUF Gaming B650-Plus
Cooling CPU: Noctua NH-D14 with LED fans, Case: 2 front in - 1 rear out
Memory 2x16GB Kingston Fury Beast RGB 6000MHz
Video Card(s) ASUS TUF RTX 4070Ti OC
Storage Samsung Evo Plus 1TB NVMe , internal WD Red 4TB for storage, WD Book 8TB
Display(s) LG CX OLED 65"
Case Lian Li LANCOOL II Mesh C Performance
Audio Device(s) HDMI audio powering Dolby Digital audio on 5.1 Z960 speaker system
Power Supply Corsair RM850x
Mouse Logitech G700
Keyboard ASUS Strix Tactic Pro
Software Windows 11 Pro x64
Really - 200 W Intel vs. 80 W AMD makes a difference in the total power bill.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
4,776 (1.97/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans removed
Cooling Optimus Block, HWLABS Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 2x A4x10, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front panel pump/res combo
Audio Device(s) Audeze Maxwell Ultraviolet, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+, TOFU-R CNC Alu/Brass, SS Prismcaps W, Jellykey, Lube/Mod, TLabs Leath/Suede Wristrest
Software Windows 10 IoT Enterprise LTSC 19044.4046
Benchmark Scores Legendary
Really - 200 W Intel vs. 80 W AMD makes a difference in the total power bill.
Plus an extra 25 W at all times when not under load.


If you're so concerned about power draw, why are you using an AMD GPU?

I'm also a little confused as to -
Evelynn said:
the poor memory controller found on 13th and 14th generation
Compared to what? AMD's setup that can't go past 6400? Or more realistically, 6200 MT. Raptor Lake can do 7200 on any four DIMM board, 7600 on any two DIMM board, and with some effort, 8000+. Karhu stable.

This results in ~125 GB/s read/write/copy with latency around 45-50 ns, compared to Zen 4/5's ~70-80 GB/s at ~55 ns.

As for the degradation issues: it's been patched. If that's not enough, simply set voltages manually - presto, problem solved. And if that's still not enough, Intel extended the warranty by two years on top of the base warranty.

Anyway, it's an odd time to sidegrade to the competing platform, with Zen 5 X3D around the corner along with Arrow Lake.
 
Evelynn
dgianstefani said:
Plus an extra 25 W at all times when not under load.

If you're so concerned about power draw, why are you using an AMD GPU?

I'm also a little confused as to -

Compared to what? AMD's setup that can't go past 6400? Or more realistically, 6200 MT. Raptor Lake can do 7200 on any four DIMM board, 7600 on any two DIMM board, and with some effort, 8000+. Karhu stable.

This results in ~125 GB/s read/write/copy with latency around 45-50 ns, compared to Zen 4/5's ~70-80 GB/s at ~55 ns.

As for the degradation issues: it's been patched. If that's not enough, simply set voltages manually - presto, problem solved. And if that's still not enough, Intel extended the warranty by two years on top of the base warranty.

Anyway, it's an odd time to sidegrade to the competing platform, with Zen 5 X3D around the corner along with Arrow Lake.

I'm not too concerned with idle power draw, since I typically have an application open in the background that keeps my cores at ~15% utilization at all times.

I use an AMD GPU because it has 16 GB of VRAM; the RTX 3080 it replaced had 10 GB, which caused me to run out of VRAM in virtual reality games very, very often - and still would, had I not switched. It's also undervolted and underclocked to pull only 180 W.

I didn't say the AMD IMC was better - it's definitely not. The 7800X3D has one CCX, resulting in a nominal read speed of 64 GB/s, compared to the 110+ GB/s of my 13700k.

You know full well that degradation can't be patched out. What Intel did was force the SVID to board vendors' fail-safe modes and implement amperage limits across the lineup. Your CPU won't crash if it's degraded because it will throttle on amperage first.

That was a weirdly aggressive reply to my recollection of the last year of dealing with LGA1700's problems, but you do you.
 

dgianstefani

TPU Proofreader
Staff member
Doing an entire platform sidegrade to save 56 watts under gaming load while adding 25 watts at idle.

Uneasy about undervolting, disabling HT, or disabling e-cores (or using Win 10 for gaming with Process Lasso, or Win 11 for the scheduler) because of the ~10% performance loss - but then running a 300-600 W GPU at 180 W to save power (also resulting in a performance loss).



This entire concept is... confusing.

Nothing about my comment is aggressive, it's simply confused questioning instead of the praise you may have expected.



Note these CPU gaming tests are done with a 4090, so they represent the worst-case scenario for gaming power draw (when paired with the fastest GPU).
 
Joined
May 20, 2020
Messages
190 (0.12/day)
Location
WorldWideWeb
System Name MPG
Processor i3-10105F
Motherboard Z490 Gaming Plus (MS-7C75)
Cooling Be Quiet Dark Rock Advanced
Memory GSkill RipJawsV 32Gb 3200
Video Card(s) GTX1060 6Gb AERO ITX OC
Storage NVMW WD Black SN750 Samsung EVO 840 SSD and Mechanical TBs...
Display(s) Asus VS278Q
Case CM HAF-X
Audio Device(s) ALC1200 Yamaha AV Creative DTT2200
Power Supply Corsair RM-1000
Mouse SteelSeries Rival
Keyboard MS Sidewinder X6
Software Windows 10/11
I'm not a "greeny", but I am against the wattage escalation in the consumer market.
Where are these "greedy" capitalists' contributions to a better world - where is the footprint of innovative technology that actually reduces energy consumption?
I don't have the budget for, nor would I buy, such hardware even with a good deal new or used, seeing the current wattage of some CPUs around, or the "insane" GPU escalation, especially from the Nvidia "monster".
Sure, we have great new advances, lots of cool new things... but this is working out a lot worse for society and the economy - though these seem to be great times for "we know who..."
 
Joined
Dec 25, 2020
Messages
5,814 (4.35/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
I'm not sure this was the wisest move, considering you could have easily purchased a chip like a Core i9-14900T and upgraded to a much more modern Nvidia GPU such as the 4080 Super, undervolted it, and made use of DLSS to further save power - performing leaps and bounds ahead of the RX 6900 XT while using *significantly* less power. Besides, are you even running an 80 Plus Titanium power supply capable of 90% conversion efficiency at 10% load to make the best of these power savings? This is increasingly overlooked as people aggressively chase watts - Gold and even Platinum-grade power supplies are often only 60 to 80% efficient at 10% load, meaning you may actually be wasting power by running at a reduced load.

In fact, simply upgrading the graphics card to an Nvidia model and using DLSS would have beaten all the other power savings I can see here - DLSS with Frame Generation makes it possible to play games like Starfield targeting 4K/120 at below 150 watts of GPU power on a 4080, and that is from experience.
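The PSU-efficiency point above can be sketched numerically. The low-load efficiency figures below are illustrative assumptions (only the Titanium 90%-at-10%-load figure comes from the 80 Plus requirements):

```python
# Wall draw for the same DC load at different conversion efficiencies.
def wall_draw(dc_load_w: float, efficiency: float) -> float:
    """AC power pulled from the wall for a given DC load."""
    return dc_load_w / efficiency

load = 100.0              # W DC - a light load on a large PSU (assumed)
gold_low_load = 0.80      # assumed Gold-tier efficiency at ~10% load
titanium_low_load = 0.90  # 80 Plus Titanium: 90% efficiency at 10% load

print(wall_draw(load, gold_low_load))      # ~125 W at the wall
print(wall_draw(load, titanium_low_load))  # ~111 W at the wall
```

The gap between the two wall-draw numbers is pure conversion loss, which is why chasing a low DC load on a low-efficiency supply can cancel out some of the savings.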
 

dgianstefani

TPU Proofreader
Staff member
I'm not sure this was the wisest move, considering you could have easily purchased a chip like a Core i9-14900T and upgraded to a much more modern Nvidia GPU such as the 4080 Super, undervolted it, and made use of DLSS to further save power - performing leaps and bounds ahead of the RX 6900 XT while using *significantly* less power. Besides, are you even running an 80 Plus Titanium power supply capable of 90% conversion efficiency at 10% load to make the best of these power savings? This is increasingly overlooked as people aggressively chase watts - Gold and even Platinum-grade power supplies are often only 60 to 80% efficient at 10% load, meaning you may actually be wasting power by running at a reduced load.

In fact, simply upgrading the graphics card to an Nvidia model and using DLSS would have beaten all the other power savings I can see here - DLSS with Frame Generation makes it possible to play games like Starfield targeting 4K/120 at below 150 watts of GPU power on a 4080, and that is from experience.
Spot on.

Additionally, percentage efficiency improvements don't convey absolute savings: a high-end CPU will use 50-100 W under gaming load, but a high-end GPU will use 2-6x that. So improving GPU efficiency is significantly more effective at reducing power consumption than moving to a CPU with the same percentage efficiency improvement.

High-efficiency PSUs are also underrated: not only do you reduce heat output into the case, you typically also improve the power quality delivered to the components, prolonging their life.
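A minimal sketch of that percentage-vs-absolute point; the wattages are illustrative assumptions roughly in line with the ranges discussed above:

```python
# The same fractional efficiency gain saves far more absolute watts on
# the component that draws more power in the first place.
def watts_saved(draw_w: float, improvement: float) -> float:
    """Absolute watts saved by a fractional efficiency improvement."""
    return draw_w * improvement

cpu_draw, gpu_draw = 100.0, 400.0  # W under gaming load (assumed)
gain = 0.20                        # a 20% efficiency gain on either part

print(watts_saved(cpu_draw, gain))  # ~20 W saved on the CPU
print(watts_saved(gpu_draw, gain))  # ~80 W saved on the GPU
```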

 
Evelynn
dgianstefani said:
Doing an entire platform sidegrade to save 56 watts under gaming load while adding 25 watts at idle.

Uneasy about undervolting, disabling HT, or disabling e-cores (or using Win 10 for gaming with Process Lasso, or Win 11 for the scheduler) because of the ~10% performance loss - but then running a 300-600 W GPU at 180 W to save power (also resulting in a performance loss).


This entire concept is... confusing.

Nothing about my comment is aggressive, it's simply confused questioning instead of the praise you may have expected.


Note these CPU gaming tests are done with a 4090, so they represent the worst-case scenario for gaming power draw (when paired with the fastest GPU).

I didn't "sidegrade" only because of power draw - I listed many other problems I had. It took me weeks to tune the PLL, IMC, and SA voltages to achieve 7200 MT/s. All of the voltages have a sweet spot on LGA1700, and by sweet spot I mean 0.020 V is the difference between stability and errors 24 hours into Karhu. AM5 doesn't have that issue: when you set 6000 ~ 6200 ~ 6400, it either works or it doesn't. And the boards will, for the most part, happily run speeds in excess of 7200 MT/s even if the CPU itself can't utilize such high bandwidth.

I also discovered weeks into owning my system that the IHS will bend to unacceptable degrees due to thermal cycling, which meant purchasing a third-party loading mechanism, taking my system apart entirely, and installing it - three times - because the loading mechanism can end up installed too loosely, compounding the IMC's already-drifting stability issues, since mounting pressure is key to achieving memory stability.

I also mentioned that the IHS, radiator, and internal case air become saturated during gaming, resulting in the CPU thermal throttling during AVX2-heavy load screens - straight to 95 C, even with an undervolt.

Also, the GPU I'm running loses only 6% performance when undervolted - going from 255 W down to 180 W, SoC not factored in. I don't want to purchase an RTX 4080; it's over 1,000 USD, and I disagree with Nvidia's current pricing lineup and will not support it.

The entire point of this post was to list the issues I had with LGA1700 - not issues that everyone has with LGA1700. There are hundreds of thousands of people more than happy with their systems. I wanted to state the reasons why I switched and why I am not currently recommending LGA1700 to my friends.

... and it's a lot more than 56 watts - whether you go by this Hardware Unboxed video, which shows as much, or by my own personal experience.

 

FreedomEclipse

~Technological Technocrat~
Joined
Apr 20, 2007
Messages
23,759 (3.75/day)
Location
London,UK
System Name Codename: RX-7878 (Mk.1) (Soon™)
Processor AMD 7800X3D
Motherboard MSI X670E GAMING PLUS
Cooling Thermalright PS120SE
Memory G.Skill Trident Z5 NEO DDR5 6000 CL32-38-38-96
Video Card(s) Sapphire Nitro+ AMD Radeon™ RX 7800 XT Gaming OC 16GB
Storage WD SN770 1TB (Boot)| 2x 2TB WD SN770 (Gaming)| 2x 2TB Crucial BX500| 2x 3TB Toshiba DT01ACA300
Display(s) LG GP850-B
Case Corsair 760T (White) {1xCorsair ML120 Pro|5xML140 Pro}
Audio Device(s) Yamaha RX-V573|Speakers: JBL Control One|Auna 300-CN|Wharfedale Diamond SW150
Power Supply Seasonic Focus GX-850 80+ GOLD
Mouse Logitech G502 X
Keyboard Duckyshine Dead LED(s) III
Software Windows 11 Home
Benchmark Scores (ノಠ益ಠ)ノ彡┻━┻
7800X3D master race. :rockout:
 

dgianstefani

TPU Proofreader
Staff member
it's a lot more than 56 watts. Whether it's this HardwareUnboxed video which proves as much, or my own personal experiences.
I trust TPU testing: it's reliable and uses the correct "stock" settings from the CPU manufacturers, rather than whatever the motherboard manufacturer decided to throw at the wall to get better "reviews" from YouTubers who test out-of-the-box Cinebench runs and in-game benchmarks instead of the custom scenes TPU uses.

AM5 not having issues running EXPO 6000 isn't really an argument for the platform over Intel, since Intel doesn't have issues at XMP 6000-6400 MT either, at least with 32/48 GB kits - but Intel at least has the option of going much higher, and it gets better numbers even with identical MT/timings anyway (96/83/85 GB/s compared to 78/78/70 GB/s, both at 6000 MT). It seems you tried to push 7200 and had trouble, likely due to a Z690 motherboard or perhaps a bad memory kit; I find it hard to believe a Raptor Lake CPU was genuinely the cause of memory struggles at 7200 MT on a two-DIMM board.

I see, so you bought AMD because of a moral argument against NVIDIA, cool.

Hopefully you got a decent price on the platform sidegrade at least. Zen 5 is out now, which affected prices a bit, though the 7800X3D seems to have held its value. Motherboards at least have gotten cheaper, and the refresh boards don't offer anything new, so you're not missing out on anything there.

A word of advice: don't run the IF out of sync with the RAM. I see you're using 2100 MHz, but running 3000 MHz/6000 MT RAM. You'll ironically get consistently better gaming performance from 2000 MHz IF, so the 1/1.5/1.5 sync isn't broken.

Enjoy your 7800X3D.
 
Joined
Nov 13, 2007
Messages
10,464 (1.71/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.4, 4.8Ghz Ring 190W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 Pro
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
I bought the day-one 13700KF and ran it at stock with an air cooler in an ITX case - I was able to pass a stock Cinebench run without throttling on a simple Peerless Assassin with my chip. I know not all chips are the same.

I too ended up undervolting after doing some benchmarks and seeing a 2 fps difference between 5.6 GHz and 5.3 GHz - I tuned the ring, bought some cheap 7600 MHz RAM, and sold my old DDR5 kit to scratch the upgrade itch.

I think as any platform ages you will fall out of love with it for one reason or another. The 13700k's power draw, for when it was released, was fine - not great. At the time it was performing like a 12900KS for less money and fewer watts. It drew more than the 7700X but also had something like 40% higher productivity performance and better gaming performance - so it compared favorably to Zen 4, and now even Zen 5.

If your primary use is gaming though, the 7800X3D can't be beat. The whole power argument I always found to be moot, because the 13700k idles very low - and unless you're gaming 24/7, chances are you're idling more than you're actually loading (especially during normal desktop use). So your annual electricity bill is probably higher with the 7800X3D if you leave your rig on for any amount of time outside of gaming. Again, it doesn't matter - we're not talking massive numbers here - but during work/desktop usage it's consistently running 15-20 W lower than the AM5 builds.
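The duty-cycle argument above can be sketched with a weighted sum; every wattage and hour split below is an assumption for illustration, and either platform can come out ahead depending on the split:

```python
# Annual energy as a duty-cycle weighted sum of power states.
def daily_kwh(states):
    """states: iterable of (watts, hours_per_day) tuples."""
    return sum(w * h for w, h in states) / 1000.0

# Illustrative per-state draws (W): Intel lower at idle, AMD lower in games.
INTEL = {"gaming": 140, "idle": 20}
AMD = {"gaming": 40, "idle": 45}

def annual_kwh(platform, gaming_h, idle_h):
    return daily_kwh([(platform["gaming"], gaming_h),
                      (platform["idle"], idle_h)]) * 365

# Gaming-heavy day vs desktop-heavy day, per platform:
print(annual_kwh(INTEL, 4, 4), annual_kwh(AMD, 4, 4))
print(annual_kwh(INTEL, 1, 10), annual_kwh(AMD, 1, 10))
```

With four hours of each, the lower gaming draw wins; with one hour of gaming and ten of desktop use, the lower idle draw wins - which is exactly the point being argued here.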
 