
I switched from LGA1700 to AM5, here are my thoughts

Joined
Jul 29, 2023
Messages
48 (0.10/day)
System Name Shirakami
Processor 7800X3D / 2200 IF & -30 CO
Motherboard Gigabyte B650i AORUS Ultra
Cooling Corsair iCUE H100i ELITE CAPELLIX w/ ultra slim 2x120mm fans
Memory 2x24GB Hynix M-Die @ 6000 - 26-35-28-30 - 65k tREFI 1.55 VDD / 1.10 VDDQ / 1.10 SoC
Video Card(s) 6900 XT Reference / -120 mV @ 2.4 GHz
Storage 1 & 2 TB NVMe - 1 TB SATA SSD
Display(s) LG 34GN850 (3440x1440) 160 Hz overclock
Case Lian Li Q58
VR HMD Reverb G2 V2
Software Fedora Linux 40 operating system (daily)
Earlier this week, I made the jump from my 13700k to a 7800X3D. Oddly enough, I didn't do this because of degradation or the current Intel debacle; rather, I switched because of power draw and thread scheduling. My system was completely stable in the one year I owned it. My specifications were as follows:

13700k - ASUS Z690 ITX - DDR5 7200 2x24GB - 6900 XT

My new specifications are

7800X3D - Gigabyte B650i AORUS ULTRA - 7200 2x24GB @ 6000 - 6900 XT

My Intel system was fairly well tuned (~10% more performance, mostly from the memory and ring), and the 7800X3D is as tuned as you can make it (~7% more performance).

To begin with, I loved my 13700k. I found it to be a wonderful CPU and a nice upgrade from a 10900k. Unfortunately, the power draw was immense in modern games - Cyberpunk 2077 averaged around 140 watts with occasional spikes upwards of 165 (HWiNFO64 CPU package power). I tried mitigating this by disabling Hyper-Threading, but the power draw only dropped to about 125 with spikes up to 150. I also tried disabling the efficiency cores (HT still enabled), which lowered the draw to around 120 with spikes up to 140. Disabling both HT and the e-cores resulted in the most substantial reduction, going all the way down to 115 watts with no real spikes to speak of. The performance difference between all three setups was less than 10%, which is one of the main factors that pushed me towards changing platforms. If it's pulling upwards of 140 watts in current games, I can't even imagine what the CPU will draw five years from now.. and electricity isn't getting any cheaper.

Disabling HT feels bad, even if it's not a huge driver of gaming performance. Disabling the e-cores feels even worse, considering those are tangible cores, unlike HT, which merely shoves more instructions into the same physical cores on each clock cycle. There's also the argument that disabling either of those features will hurt gaming performance in the future - leaving you with only one option: underclocking. You can set a flat 5 GHz on the p-cores and 3.9 GHz on the e-cores at a fairly reasonable voltage, and power consumption gets cut by roughly 25% in gaming scenarios.. but you also completely lose out on Raptor Lake being a monstrosity when it comes to clockspeed. It feels just as bad as the above concessions. There is also the issue of thread scheduling; I don't run Windows 11, I daily Linux, which didn't get proper support for Raptor Lake until kernel 6.6 in October 2023. I use the term support loosely, because in gaming scenarios the e-cores still become loaded with instructions. Windows 10 was and is my go-to whenever I need to dual boot; however, its thread scheduler isn't anywhere near as good as Windows 11's. It's a similar issue to Linux, wherein the efficiency cores are loaded with instructions, causing noticeable microstutters in games.

Add up all of the issues I've listed above: the poor memory controller found on 13th and 14th generation, with stability drifting on and off when aiming for speeds beyond 7200 MT/s; the IHS bending, requiring a third-party loading mechanism (which I purchased); and the current degradation caused by the SVID requesting upwards of 1.65 V.

Needless to say, after over a year of ownership, I was fairly sick of it all.

The power draw was a problem I needed to solve. I undervolted and overclocked the system and set a strict power limit of 175 W to ensure AVX2 workloads wouldn't cause crashing. It only needed to be stable for games, not heavy workloads, so the voltage I aimed for was whatever could run Cyberpunk 2077 in a CPU-bound scenario.

The scheduling was an issue I needed to solve, so I purchased Process Lasso for when I use Windows 10 - which I highly recommend; it's a wonderful application.

The bending was an issue I needed to solve, so I purchased the Thermaltake loading mechanism.

The lack of memory stability at XMP speeds (7200) was an issue I needed to solve, which took weeks of tuning PLL, VDD/VDDQ, IMC, and SA voltages, among others.

The heat itself was an issue I needed to solve; I was highly uncomfortable with my CPU pulling 140 watts in games, only to thermal throttle during load screens once the radiator, IHS, and liquid were all heat-saturated (240 mm AIO).

The only saving grace of my build was the ASUS Z690 Gaming WiFi ITX motherboard. I absolutely adored that board; it was overengineered, and the VRMs never rose past 65 C in stress tests. And then there's the single-threaded performance - Raptor Lake has absolutely insane ST performance. My 7800X3D achieves 655 points in CPU-Z single-thread, while the 13700k is upwards of 900.

Admittedly, as of writing this, I haven't been on AM5 long enough to have experienced SoC voltages murdering AM5 CPUs outright. I also wasn't around when memory training took upwards of 5 minutes, and I wasn't here when core parking was a problem either. However, in the here and now, I'm absolutely in love with the ease of use of my 7800X3D. It performs about the same as, and sometimes better than, my 13700k did in gaming situations.. while drawing 40 watts. If I were recommending a system to a friend today, I couldn't in good conscience recommend LGA1700 - my experience with it was quite negative, even if the CPU itself was blisteringly fast.

I hope that whatever Intel releases for their next generation is more refined. Competition is a very, very good thing - without it, we wouldn't even have an alternative like the 7800X3D to choose from. I love both teams. I even purchased a 1700X the day it came out and dealt with the fiasco of very buggy X370 BIOSes. I loved my 10900k as well; it was a wonderful CPU, but it was rendered obsolete when 12th generation released, considering the e-cores were almost on par with it in ST performance. The 3570k I had before that was rock solid; I never had any issues with it whatsoever. But this entire LGA1700 platform has just been a mess for me personally; in retrospect, I wish I had purchased a 12900k in place of the 13700k. At least on Alder Lake, disabling the efficiency cores granted you a boost to cache speed - so it wasn't a complete loss.

But as things stand, I'll be holding on to this chip for a while. I should note the 7800X3D did not exist at the time I purchased my 13700k.
 
Joined
Feb 10, 2007
Messages
2,698 (0.42/day)
Location
Oulu, Finland
System Name Enslaver :)
Processor Ryzen 7 7800X3D
Motherboard ASUS TUF Gaming B650-Plus
Cooling CPU: Noctua D15 G2, Case: 2 front in, 1 rear out
Memory 2x16GB Kingston Fury Beast RGB 6000MHz
Video Card(s) ASUS TUF RTX 4070Ti OC
Storage Samsung Evo Plus 1TB NVMe , internal WD Red 4TB for storage, WD Book 8TB
Display(s) LG CX OLED 65"
Case Lian Li LANCOOL II Mesh C Performance
Audio Device(s) HDMI audio powering Dolby Digital audio on 5.1 Z960 speaker system
Power Supply Corsair RM850x
Mouse Logitech G700
Keyboard ASUS Strix Tactic Pro
Software Windows 11 Pro x64
Really, 200 W Intel vs. 80 W AMD makes a difference in the total power bill.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
4,992 (2.00/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
Really, 200 W Intel vs. 80 W AMD makes a difference in the total power bill.
Plus an extra 25 W at all times when not under load.


If you're so concerned about power draw why are you using an AMD GPU?

I'm also a little confused as to -
Evelynn said:
the poor memory controller found on 13th and 14th generation
Compared to what? AMD's setup that can't go past 6400? Or more realistically, 6200 MT. Raptor Lake can do 7200 on any four DIMM board, 7600 on any two DIMM board, and with some effort, 8000+. Karhu stable.

This results in ~125 GB/s read/write/copy with latency around 45-50 ns, compared to Zen 4/5's ~70-80 GB/s at ~55 ns.

As for the degradation issues: they've been patched, and if that's still not enough, simply set voltages manually - presto, problem solved. If that's still not enough, Intel extended the warranty by two years on top of the base warranty.

Anyway, it's an odd time to sidegrade to the competing platform, with Zen 5 X3D around the corner along with Arrow Lake.
 
Joined
Jul 29, 2023
Messages
48 (0.10/day)
System Name Shirakami
Processor 7800X3D / 2200 IF & -30 CO
Motherboard Gigabyte B650i AORUS Ultra
Cooling Corsair iCUE H100i ELITE CAPELLIX w/ ultra slim 2x120mm fans
Memory 2x24GB Hynix M-Die @ 6000 - 26-35-28-30 - 65k tREFI 1.55 VDD / 1.10 VDDQ / 1.10 SoC
Video Card(s) 6900 XT Reference / -120 mV @ 2.4 GHz
Storage 1 & 2 TB NVMe - 1 TB SATA SSD
Display(s) LG 34GN850 (3440x1440) 160 Hz overclock
Case Lian Li Q58
VR HMD Reverb G2 V2
Software Fedora Linux 40 operating system (daily)
Plus an extra 25 W at all times when not under load.

If you're so concerned about power draw why are you using an AMD GPU?

I'm also a little confused as to -

Compared to what? AMD's setup that can't go past 6400? Or more realistically, 6200 MT. Raptor Lake can do 7200 on any four DIMM board, 7600 on any two DIMM board, and with some effort, 8000+. Karhu stable.

This results in ~125 GB/s read/write/copy with latency around 45-50 ns, compared to Zen 4/5's ~70-80 GB/s at ~55 ns.

As for the degradation issues: they've been patched, and if that's still not enough, simply set voltages manually - presto, problem solved. If that's still not enough, Intel extended the warranty by two years on top of the base warranty.

Anyway, it's an odd time to sidegrade to the competing platform, with Zen 5 X3D around the corner along with Arrow Lake.

I'm not too concerned with idle power draw, since I typically have an application open in the background that keeps my cores at ~15% utilization at all times.

I use an AMD GPU because it has 16 GB, and the RTX 3080 it replaced was 10 GB - which caused me to run out of VRAM in virtual reality games very, very often, and still would if I hadn't switched. It's also undervolted and underclocked to pull only 180 W.

I didn't say the AMD IMC was better; it's definitely not. The 7800X3D has one CCX, resulting in a nominal read speed of 64 GB/s, compared to the 110+ of my 13700k.

You know full well that degradation can't be patched out. What Intel did was force the SVID to board vendors' fail-safe modes and implement amperage limits across the lineup. Your CPU won't crash if it's degraded, because it's going to throttle on amperage first.

That's a weird and overly aggressive response to my recollection of a year of dealing with LGA1700's problems, but you do you.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
4,992 (2.00/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
Doing an entire platform sidegrade to save 56 watts under gaming load while adding 25 watts at idle.

Uneasy about undervolting/disabling HT/disabling e-cores (or using Win 10 for gaming with Process Lasso, or Win 11 for its scheduler) because of the ~10% performance loss - but then running a 300-600 W GPU at 180 W to save power (also resulting in a performance loss).



This entire concept is... confusing.

Nothing about my comment is aggressive, it's simply confused questioning instead of the praise you may have expected.



Note these CPU gaming tests are done with a 4090, so they represent the worst-case scenario for gaming power draw (when paired with the fastest GPU).
 
Joined
May 20, 2020
Messages
194 (0.12/day)
Location
WorldWideWeb
System Name ROG
Processor i5-11400F
Motherboard ROG Strix Z490-H Gaming
Cooling Be Quiet Pure Rock (BK009)
Memory Kingston Fury RENEGADE KF3600C16D4/16GX (KIT 2x16)
Video Card(s) GTX1060 6Gb AERO ITX OC
Storage NVMe WD Black SN750 ; Samsung EVO 840 SSD and Mechanical TBs...
Display(s) iiyama Black Hawk G-Master GB2745HSU
Case CM HAF-X
Audio Device(s) Yamaha AV Creative DTT2200
Power Supply Corsair RM-1000
Mouse SteelSeries Rival
Keyboard MS Sidewinder X6
Software Windows 10/11
I'm not a "greeny", but I am against the wattage escalation in the consumer market.
Where are these "greedy" capitalists' contributions to a better evolution of this world - their footprint of innovative technology advances that actually reduce energy consumption?
I don't have the budget, nor would I take such hardware even with a good deal, new or used, when I see the current wattage of some CPUs around, or the "insane" GPU escalation, especially from the Nvidia "monster".
Sure, we have great new advances, lots of cool new things... but this is working out a lot worse for society and the economy as a whole... though these seem to be great times for "we know who"...
 
Joined
Dec 25, 2020
Messages
6,522 (4.63/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
I'm not sure this was the wisest move, considering you could have easily purchased a chip like a Core i9-14900T and upgraded to a much more modern Nvidia GPU such as the 4080 Super, undervolted it, and made use of DLSS to further save power - performing leaps and bounds ahead of the RX 6900 XT while using *significantly* less power. Besides, are you even running an 80 Plus Titanium power supply capable of 90% conversion efficiency at 10% load to make the best of these power savings? This has become increasingly overlooked as people aggressively chase watt savings - a Gold or even Platinum-grade power supply is often only 60 to 80% efficient at 10% load, meaning you're actually wasting power by running at a reduced load.

In fact, simply upgrading the graphics card to an Nvidia model and using DLSS would have beaten all of the other power savings I can see here - DLSS with Frame Generation makes it possible to play games like Starfield targeting 4K/120 at below 150 watts of GPU power on a 4080, and that is from experience.
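To put rough numbers on the low-load efficiency point: wall draw is the DC load divided by the conversion efficiency, so a less efficient supply turns more of every watt into heat. All figures below are assumed for illustration (a hypothetical ~85 W load, i.e. ~10% of an 850 W unit, and ballpark efficiency values), not measurements of any specific PSU.

```python
# Sketch of the wall-draw math behind the PSU efficiency point above.
# All numbers are assumed for illustration, not measured.

def wall_draw(dc_load_w: float, efficiency: float) -> float:
    """Power pulled from the outlet for a given DC load."""
    return dc_load_w / efficiency

load = 85.0  # hypothetical ~10% load on an 850 W unit

for label, eff in [("~75% efficient at 10% load", 0.75),
                   ("Titanium-class, ~90% at 10% load", 0.90)]:
    w = wall_draw(load, eff)
    print(f"{label}: {w:.1f} W from the wall ({w - load:.1f} W lost as heat)")
```

For the same 85 W of useful load, the less efficient supply pulls roughly 19 W more from the wall, all of it dumped into the room as heat.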
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
4,992 (2.00/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
I'm not sure this was the wisest move, considering you could have easily purchased a chip like a Core i9-14900T and upgraded to a much more modern Nvidia GPU such as the 4080 Super, undervolted it, and made use of DLSS to further save power - performing leaps and bounds ahead of the RX 6900 XT while using *significantly* less power. Besides, are you even running an 80 Plus Titanium power supply capable of 90% conversion efficiency at 10% load to make the best of these power savings? This has become increasingly overlooked as people aggressively chase watt savings - a Gold or even Platinum-grade power supply is often only 60 to 80% efficient at 10% load, meaning you're actually wasting power by running at a reduced load.

In fact, simply upgrading the graphics card to an Nvidia model and using DLSS would have beaten all of the other power savings I can see here - DLSS with Frame Generation makes it possible to play games like Starfield targeting 4K/120 at below 150 watts of GPU power on a 4080, and that is from experience.
Spot on.

Additionally, percentage efficiency improvements don't tell the whole story: a high-end CPU will use 50-100 W under gaming load, but a high-end GPU will use 2-6x that. So improving GPU efficiency is significantly more effective at reducing power consumption than moving to a CPU with the same percentage efficiency improvement.

High-efficiency PSUs are also underrated: not only do you reduce heat output into the case, you also typically improve the power quality delivered to the components, prolonging their life.
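A quick sketch of why the same percentage improvement matters more on the GPU - the wattages below are assumptions picked from the ranges mentioned above (50-100 W CPU, 2-6x that for the GPU):

```python
# Equal % efficiency gains save more absolute watts on the bigger consumer.
# The wattages below are assumed, picked from the ranges in the post above.

def watts_saved(base_w: float, gain: float) -> float:
    """Watts saved if a part does the same work 'gain' more efficiently."""
    return base_w - base_w / (1 + gain)

cpu_w = 75.0   # assumed: middle of the 50-100 W gaming range
gpu_w = 300.0  # assumed: 4x the CPU, within the quoted 2-6x range

for name, w in [("CPU", cpu_w), ("GPU", gpu_w)]:
    print(f"20% efficiency gain on the {name} saves {watts_saved(w, 0.20):.1f} W")
```

The identical 20% improvement saves 12.5 W on the assumed CPU but 50 W on the assumed GPU - same percentage, four times the absolute savings.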

 
Joined
Jul 29, 2023
Messages
48 (0.10/day)
System Name Shirakami
Processor 7800X3D / 2200 IF & -30 CO
Motherboard Gigabyte B650i AORUS Ultra
Cooling Corsair iCUE H100i ELITE CAPELLIX w/ ultra slim 2x120mm fans
Memory 2x24GB Hynix M-Die @ 6000 - 26-35-28-30 - 65k tREFI 1.55 VDD / 1.10 VDDQ / 1.10 SoC
Video Card(s) 6900 XT Reference / -120 mV @ 2.4 GHz
Storage 1 & 2 TB NVMe - 1 TB SATA SSD
Display(s) LG 34GN850 (3440x1440) 160 Hz overclock
Case Lian Li Q58
VR HMD Reverb G2 V2
Software Fedora Linux 40 operating system (daily)
Doing an entire platform sidegrade to save 56 watts under gaming load while adding 25 watts at idle.

Uneasy about undervolting/disabling HT/disabling e-cores (or using Win 10 for gaming with Process Lasso, or Win 11 for its scheduler) because of the ~10% performance loss - but then running a 300-600 W GPU at 180 W to save power (also resulting in a performance loss).


This entire concept is... confusing.

Nothing about my comment is aggressive, it's simply confused questioning instead of the praise you may have expected.


Note these CPU gaming tests are done with a 4090, so they represent the worst-case scenario for gaming power draw (when paired with the fastest GPU).

I didn't "sidegrade" only because of power draw; I listed many other problems I had. It took me weeks to tune the PLL, IMC, and SA voltages to achieve 7200 MT/s - all of the voltages have a narrow sweet spot on LGA1700, and by sweet spot I mean 0.020 V can be the difference between stability and errors 24 hours into Karhu. AM5 doesn't have that issue: when you set 6000 ~ 6200 ~ 6400, it either works or it doesn't. And the boards will, for the most part, happily run speeds in excess of 7200 MT/s even if the CPU itself can't utilize such high bandwidth.

I also discovered weeks into owning my system that the IHS will bend to unacceptable degrees due to thermal cycling, which meant purchasing a third-party loading mechanism, taking my system apart entirely, and installing it. Three times - because the loading mechanism can be installed too loosely, which compounds the IMC's already drifting stability issues, since mounting pressure is key to achieving memory stability.

I also mentioned that the IHS, radiator, and internal case air become heat-saturated during gaming, resulting in the CPU thermal throttling during AVX2-heavy load screens. Straight to 95 C, even with an undervolt.

Also, the GPU I'm running loses only 6% performance when undervolted - going from 255 W down to 180 W, when the SoC isn't factored in. I don't want to purchase an RTX 4080; it's over 1000 USD, and I disagree with Nvidia's current pricing lineup and will not support it.
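For what it's worth, that trade is a large performance-per-watt win on paper. Using only the figures from this post (6% performance lost, 255 W down to 180 W):

```python
# Perf-per-watt change for the undervolt described above:
# ~6% performance lost while dropping from 255 W to 180 W.

def perf_per_watt(relative_perf: float, watts: float) -> float:
    return relative_perf / watts

stock = perf_per_watt(1.00, 255.0)
undervolted = perf_per_watt(0.94, 180.0)
print(f"Efficiency gain from the undervolt: {undervolted / stock - 1:.0%}")
```

Roughly a third more frames per watt, at the cost of 6% peak performance.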

The entire point of this post was to list the issues I had with LGA1700 - not the issues that everyone has with LGA1700. There are hundreds of thousands of people who are more than happy with their systems. I wanted to state the reasons why I switched and why I'm not currently recommending LGA1700 to my friends.

.. and it's a lot more than 56 watts, whether you go by this Hardware Unboxed video, which shows as much, or my own personal experience.

 

FreedomEclipse

~Technological Technocrat~
Joined
Apr 20, 2007
Messages
23,987 (3.74/day)
Location
London,UK
System Name DarnGosh Edition
Processor AMD 7800X3D
Motherboard MSI X670E GAMING PLUS
Cooling Thermalright AM5 Contact Frame + Phantom Spirit 120SE
Memory G.Skill Trident Z5 NEO DDR5 6000 CL32-38-38-96
Video Card(s) Asus Dual Radeon™ RX 6700 XT OC Edition
Storage WD SN770 1TB (Boot)| 2x 2TB WD SN770 (Gaming)| 2x 2TB Crucial BX500| 2x 3TB Toshiba DT01ACA300
Display(s) LG GP850-B
Case Corsair 760T (White) {1xCorsair ML120 Pro|5xML140 Pro}
Audio Device(s) Yamaha RX-V573|Speakers: JBL Control One|Auna 300-CN|Wharfedale Diamond SW150
Power Supply Seasonic Focus GX-850 80+ GOLD
Mouse Logitech G502 X
Keyboard Duckyshine Dead LED(s) III
Software Windows 11 Home
Benchmark Scores ლ(ಠ益ಠ)ლ

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
4,992 (2.00/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
it's a lot more than 56 watts, whether you go by this Hardware Unboxed video, which shows as much, or my own personal experience.
I trust TPU testing because it's reliable and uses the correct "stock" settings from the CPU manufacturers, rather than whatever the motherboard manufacturer decided to throw at the wall to get better "reviews" from YouTubers who test out-of-the-box Cinebench runs and in-game benchmarks instead of the custom scenes TPU uses.

AM5 not having issues running EXPO 6000 isn't really an argument for the platform over Intel, since Intel doesn't have issues at XMP 6000-6400 MT either, at least with 32/48 GB kits - but Intel at least has the option of going much higher, and it gets better numbers even with identical MT/timings anyway (96/83/85 GB/s compared to 78/78/70 GB/s, both at 6000 MT). It seems you tried to push 7200 and had trouble, likely due to a Z690 motherboard or perhaps a bad memory kit; I find it hard to believe a Raptor Lake CPU was genuinely the cause of memory struggles at 7200 MT on a two-DIMM board.

I see, so you bought AMD because of a moral argument against NVIDIA, cool.

Hopefully you got a decent price on the platform sidegrade at least. Zen 5 is out now, which affected prices a bit, though the 7800X3D seems to have held its value. Motherboards at least have gotten cheaper, and the refresh boards don't offer anything new, so you're not missing out on anything there.

A word of advice: don't run the IF out of sync with the RAM. I see you're using 2100 MHz, but running 3000 MHz/6000 MT RAM. You'll ironically get consistently better gaming performance from 2000 MHz IF, so that the 1/1.5/1.5 sync isn't broken.

Enjoy your 7800X3D.
 
Joined
Nov 13, 2007
Messages
10,679 (1.72/day)
Location
Austin Texas
System Name Planet Espresso
Processor 13700KF @ 5.5GHZ 1.285v - 235W cap
Motherboard MSI 690-I PRO
Cooling Thermalright Phantom Spirit EVO
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
I bought the day-one 13700KF and ran it at stock with an air cooler in an ITX case - I was able to pass a stock Cinebench run without throttling on a simple Peerless Assassin with my chip. I know not all chips are the same.

I too ended up undervolting after doing some benches and seeing a 2 FPS difference between 5.6 GHz and 5.3 GHz - tuned the ring, bought some cheap 7600 MHz RAM, and sold my old DDR5 kit to scratch the upgrade itch.

I think as any platform ages, you fall out of love with it for one reason or another. The 13700K's power draw, for when it was released, was fine, not great. At the time it was performing like a 12900KS for less money and fewer watts. It drew more than the 7700X but also had something like 40% higher productivity performance and better gaming performance, so it compared favorably to Zen 4 and now even Zen 5.

If your primary use is gaming, though, the 7800X3D can't be beat. The whole power argument I always found to be moot, because the 13700k idles so low - and unless you're gaming 24/7, chances are you're idling more than you're actually loading (especially during normal desktop use). So your annual electricity bill is probably higher with the 7800X3D if you leave your rig on for any amount of time outside of gaming. Again, it doesn't matter much - we're not talking massive numbers here - but during work/desktop usage it consistently runs 15-20 W lower than the AM5 builds.
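As a back-of-envelope check on the bill argument - all three inputs below (a 20 W idle delta, 8 hours of desktop use per day, $0.15/kWh) are assumptions, so plug in your own:

```python
# Rough annual cost of a constant idle-power difference.
# 20 W delta, 8 h/day of desktop use, and $0.15/kWh are all assumptions.

def annual_cost(delta_w: float, hours_per_day: float, rate_per_kwh: float) -> float:
    kwh_per_year = delta_w / 1000 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

print(f"${annual_cost(20, 8, 0.15):.2f} per year")
```

Even at a full 20 W for 8 hours a day, that's under $10 a year at $0.15/kWh - real, but small, which matches the "doesn't matter much" framing.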
 
Joined
Dec 25, 2020
Messages
6,522 (4.63/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
I don't want to purchase an RTX 4080, it's over 1000 USD and I disagree with Nvidia's current pricing lineup and will not support it.

Interesting, because the aforementioned 4080 Super has the exact same MSRP as the 6900 XT had when it came out - yet you purchased a 6900 XT all the same.
 
Joined
Jun 21, 2013
Messages
601 (0.14/day)
Processor Ryzen 9 3900x
Motherboard MSI B550 Gaming Plus
Cooling be quiet! Dark Rock Pro 4
Memory 32GB GSkill Ripjaws V 3600CL16
Video Card(s) 3060Ti FE 0.9v
Storage Samsung 970 EVO 1TB, 2x Samsung 840 EVO 1TB
Display(s) ASUS ProArt PA278QV
Case be quiet! Pure Base 500
Audio Device(s) Edifier R1850DB
Power Supply Super Flower Leadex III 650W
Mouse A4Tech X-748K
Keyboard Logitech K300
Software Win 10 Pro 64bit
I have to agree with the other comments. The reasons for this upgrade are really strange.

Blaming Intel because your 7200 RAM is not stable, while running the same RAM underclocked to 6000 on AMD.
Complaining about the 13700K's power draw vs. the 7800X3D's while running a power hog of a GPU.
All that fiddling with HT and e-cores is also strange, when you could have just lowered the CPU power limit to 65 W while losing maybe 2-3% FPS in games.
 
Joined
Dec 25, 2020
Messages
6,522 (4.63/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Have to agree with other comments. The reasons for this upgrade are really strange.

Blaming Intel that your 7200 RAM does not overclock stably to 7400, while running the same RAM underclocked at 6000 on AMD.
Complaining about 13700K power draw vs 7800X3D while running a power hog of a GPU.
All that fiddling with HT and E-cores is also strange, when you could have just lowered the CPU power limit to 65 W while losing maybe 1-2% FPS in games.

Yup, it's perfectly ok to get bored of something and upgrade it, I like exciting new things too - that's why I sold my 3090 and got the 4080 after all. No need to make up excuses for any of it :D
 
Joined
Feb 18, 2005
Messages
5,755 (0.80/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Guys, reread their post. It isn't just about the total watts dissipated, it's the fact that the heat soak from the CPU is far lower.

Yup, it's perfectly ok to get bored of something and upgrade it, I like exciting new things too - that's why I sold my 3090 and got the 4080 after all. No need to make up excuses for any of it :D
This is something I've also realised on these forums - a lot of people spend a lot of time trying to justify their hardware purchase to others, when the real reason is that they just wanted to buy something new and shiny. And that's totally fine! You don't have to justify anything that makes you happy and doesn't hurt others - it's your hard-earned cash!
 
Joined
Mar 19, 2024
Messages
24 (0.11/day)
Location
Pirkkala, Finland
System Name The Dark Princess XXIX
Processor Intel Core i9-14900KS | 1.45V Voltage Limit, LLC 6 & -120mV Undervolt
Motherboard Asus ROG Maximus Z790 Apex Encore
Cooling Arctic Liquid Freezer III 360 ARGB
Memory G.Skill Trident Z5 Royal DDR5 2x16GB 6400C32
Video Card(s) Sapphire Nitro+ RX 7900 XTX Vapor-X
Storage Samsung 990 PRO 2TB NVMe M.2
Display(s) Samsung Odyssey Smart G6 & Acer Nitro VG270U
Case Montech KING 95 PRO
Audio Device(s) Audio-Technica ATX-M50xBT2 & Sennheiser Profile
Power Supply Asus ROG Strix AURA Edition 1000W 80+ Gold
Mouse Razer Basilisk Ultimate
Keyboard Razer Blackwidow V3 Tenkeyless
At the end of the day, PCs are a hobby for a lot of us, and hobbies don't ask for justification.
 
Joined
Apr 9, 2024
Messages
207 (1.00/day)
System Name Crapostrophic
Processor AMD Ryzen Z1 Extreme
Motherboard ASUS Custom PCB
Cooling Stock Asus Fan and Cooler Design
Memory 16GB of LPDDR5 running 6400mhz with tweaked timings
Video Card(s) AMD Radeon 780M APU
Storage 2TB Aorus 7300 Gen 4
Display(s) 7 Inch IPS Display @120hz
Case Plastic Shell Case designed by Asus
Audio Device(s) Asus ROG Delta
Power Supply 40WHrs, 4S1P, 4-cell Li-ion with a 65W PD Charger
Mouse Asus ROG Keris Wireless
Keyboard AKKO 3098B hotswapped to speed silver pro switches
Software Windows 11 Home (Debloated and tweaked)
Oooofff, that reference to another HWUB screenshot. People nowadays "rely" on those for their help, even when the testing methodology isn't bulletproof or is somewhat inconsistent from run to run.
 
Joined
Jul 26, 2024
Messages
165 (1.67/day)
And unless you're gaming 24/7 chances are you're idling more than you are actually loading (especially during normal desktop use). So your annual electricity bill is probably higher with the 7800x3d if you run your rig on for any amount of time outside of gaming.
QFT.
Btw, the 4070S uses only 6 W in normal desktop/browser/YouTube use and typically 140-180 W in gaming, and it pretty much ties with the 6900 XT on performance. It would be a better sidegrade if the OP was looking for power savings.
 
Last edited:
Joined
Jul 13, 2016
Messages
3,245 (1.07/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
And unless you're gaming 24/7 chances are you're idling more than you are actually loading (especially during normal desktop use). So your annual electricity bill is probably higher with the 7800x3d if you run your rig for any amount of time outside of gaming. Again, doesn't matter - we're not talking massive numbers here - but during work/desktop usage it's running 15-20 W lower consistently than the AM5 builds.

Light desktop usage like browsing the web / doing office tasks is not the same as idle. Such a workload requires the CPU to boost one to three cores much higher up their frequency table. For CPUs like the 13700K, 13900K, 14700K, 14900K, etc., that pushes a lot of power to achieve those high clocks, which could very well mean they are less efficient in light usage.

If you read OP's post, he explicitly points out that he is in fact always running something taking at least 15% of the CPU.

Plus an extra 25 W at all times when not under load.

I think it's wise to point out that TPU's idle power numbers are total system power consumption. When comparing the actual consumption of the CPU at idle, both Intel and AMD are similar:


1724335574760.png



This is important to note because if you look at TPU's test setup it's using an X670 board, which is a dual chipset solution. This dual chip solution likely increases power consumption as compared to a single chip B-class motherboard like what the OP has. There's also going to be variability depending on which motherboard you choose as some vendors tend to favor performance over saving a few idle watts.

This is inherently the caveat of whole-system power metrics: you have to consider that the platform itself is a variable that impacts the resulting number, and it can have a significant impact when we are talking sub-86 W. You can't rightfully say that OP's platform is less efficient at idle because you have not accounted for the difference in platform, particularly given that OP's B-class mobo has one less chip.
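Since the OP is on Fedora anyway, it's easy to sanity-check the CPU-only number from the RAPL energy counter rather than arguing from whole-system charts. A rough sketch, assuming the usual Intel package-domain powercap path (recent AMD kernels expose an equivalent counter, and reading it may need root):

```shell
#!/bin/sh
# Sketch: derive average CPU package watts from two samples of the RAPL
# energy counter. This measures the CPU alone, sidestepping the
# whole-system-at-the-wall problem discussed above.
RAPL=/sys/class/powercap/intel-rapl:0/energy_uj

watts_between() {
  # $1/$2 = energy readings in microjoules, $3 = seconds between them
  awk -v a="$1" -v b="$2" -v s="$3" \
    'BEGIN { printf "%.2f\n", (b - a) / (s * 1000000) }'
}

if [ -r "$RAPL" ]; then
  e1=$(cat "$RAPL"); sleep 5; e2=$(cat "$RAPL")
  echo "avg package power: $(watts_between "$e1" "$e2" 5) W"
fi
```

Leave the desktop idle while it samples; anything polling in the background (browser tabs, monitoring tools) will inflate the reading.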

Weird and overly aggressive comment in response to my recollection of the last year of dealing with LGA 1700's problems, but you do you.

That's just dgianstgefani; they've been using that whole-system idle power consumption metric as the straw they grasp any time anyone on TPU dares say AMD is more efficient. Of course Zen is absolutely more efficient; he might have a point if he were to say the platform was less efficient at idle.

Just take a look at his system specs:

1724337431539.png


Says all you need to know about which he thinks is better.


Have to agree with other comments. The reasons for this upgrade are really strange.

Blaming Intel because your 7200 RAM is not stable, while running the same RAM underclocked at 6000 on AMD.
Complaining about 13700K power draw vs 7800X3D while running a power hog of a GPU.
All that fiddling with HT and E-cores is also strange, when you could have just lowered the CPU power limit to 65 W while losing maybe 2-3% FPS in games.

You lose a lot more than 2-3% FPS in games (and yes, this isn't a 13700K power scaling chart, but good luck finding that):

1724336688795.png


Couldn't find a 13700K power scaling chart but here's one for the 13900K:

1724337035799.png



95 W is really the lowest you want to go with the higher-end 13th- and 14th-gen Intel CPUs; below that, performance starts dropping off a cliff. The 14900K sees its MT performance nearly halved, as does the 13900K. The 13700K won't be hit as hard, since it has fewer cores to feed, but it certainly wouldn't make sense to buy such a CPU and then limit its performance so much.
 
Last edited:
Joined
Jul 29, 2023
Messages
48 (0.10/day)
System Name Shirakami
Processor 7800X3D / 2200 IF & -30 CO
Motherboard Gigabyte B650i AORUS Ultra
Cooling Corsair iCUE H100i ELITE CAPELLIX w/ ultra slim 2x120mm fans
Memory 2x24GB Hynix M-Die @ 6000 - 26-35-28-30 - 65k tREFI 1.55 VDD / 1.10 VDDQ / 1.10 SoC
Video Card(s) 6900 XT Reference / -120 mV @ 2.4 GHz
Storage 1 & 2 TB NVMe - 1 TB SATA SSD
Display(s) LG 34GN850 (3440x1440) 160 Hz overclock
Case Lian Li Q58
VR HMD Reverb G2 V2
Software Fedora Linux 40 operating system (daily)
Interesting, because the aforementioned 4080 Super has the exact same MSRP the 6900 XT had when it came out. Yet you purchased a 6900 XT all the same.
I bought the 6900 XT for 500 USD on sale from AMD's website towards the end of its life.
I don't mind that it was nearly 1k USD at launch either, because the die size is massive compared to what Nvidia is currently offering and what they have offered in the past.

Not only that, but I use Linux, and the drivers for AMD GPUs are baked into the kernel. Nvidia on Linux is not ideal as the drivers were closed-source until recently, and the open-source drivers are not very mature.
1724337383233.png
1724337396105.png
1724337399632.png
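For anyone curious which driver their own box is actually running, the bound kernel module shows up in lspci. A quick sketch (the parsing helper is split out so it can be checked against canned `lspci -k`-style output):

```shell
#!/bin/sh
# Sketch: print the kernel driver bound to each GPU. "amdgpu" is the
# in-kernel AMD driver; "nvidia" the proprietary module; "nouveau" the
# community open-source one.
driver_in_use() {
  # reads `lspci -k`-style text on stdin, prints just the driver name(s)
  grep 'Kernel driver in use' | awk -F': ' '{ print $2 }'
}

# limit to display controllers, then extract the driver line
lspci -nnk 2>/dev/null | grep -A3 -Ei 'vga|3d controller' | driver_in_use || true
```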
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,553 (2.89/day)
Location
Jyväskylä, Finland
System Name 4K-gaming
Processor AMD Ryzen 7 5800X @ PBO +200 -20CO
Motherboard Asus ROG Crosshair VII Hero
Cooling Arctic Freezer 50, EKWB Vector TUF
Memory 32GB Kingston HyperX Fury DDR4-3200
Video Card(s) Asus GeForce RTX 3080 TUF OC 10GB
Storage A pack of SSDs totaling 3.2TB + 3TB HDDs
Display(s) 27" 4K120 IPS + 32" 4K60 IPS + 24" 1080p60
Case Corsair 4000D Airflow White
Audio Device(s) Asus TUF H3 Wireless / Corsair HS35
Power Supply EVGA Supernova G2 750W
Mouse Logitech MX518 + Asus ROG Strix Edge Nordic
Keyboard Roccat Vulcan 121 AIMO
VR HMD Oculus Rift CV1
Software Windows 11 Pro
Benchmark Scores It runs Crysis
At least you don't need to worry about the Inhell chip breaking anymore. And the 7800X3D is practically the best gaming CPU even today.

edit: that 6900 XT should still be fine for some time; my 3080 is a weaker card, yet I play at 4K120, which has way more pixels to render than your 1440p UW.
 
Joined
Dec 25, 2020
Messages
6,522 (4.63/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
I bought the 6900 XT for 500 USD on sale from AMD's website towards the end of its life.
I don't mind that it was nearly 1k USD at launch either, because the die size is massive compared to what Nvidia is currently offering and what they have offered in the past.
View attachment 360116View attachment 360117View attachment 360118

Sure, you got it on a clearance deal, valid. Ada isn't that old yet. But why are we comparing a TSMC N4P chip (bleeding edge node) to an ancient processor from 2010 and developed during the late 2000s?
 
Joined
Nov 13, 2007
Messages
10,679 (1.72/day)
Location
Austin Texas
System Name Planet Espresso
Processor 13700KF @ 5.5GHZ 1.285v - 235W cap
Motherboard MSI 690-I PRO
Cooling Thermalright Phantom Spirit EVO
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
I bought the 6900 XT for 500 USD on sale towards the end of life from AMD's website.
I don't mind that it was nearly 1k USD either, because the die size is massive compared to what Nvidia is currently offering you and what they have offered in the past.

Not only that, but I use Linux - the drivers for AMD GPU's are baked into the kernel. Nvidia on Linux is not ideal as the drivers were closed-source until recently, and the open-source drivers are not very mature.
View attachment 360116View attachment 360117View attachment 360118

1724337863589.png


Other reviews will show a 10-15 W idle / normal usage delta... there are a few of them.

Also, losing 3% at 125 W at 720p isn't bad....
 

Solaris17

Super Dainty Moderator
Staff member
Joined
Aug 16, 2005
Messages
26,849 (3.83/day)
Location
Alabama
System Name RogueOne
Processor Xeon W9-3495x
Motherboard ASUS w790E Sage SE
Cooling SilverStone XE360-4677
Memory 128gb Gskill Zeta R5 DDR5 RDIMMs
Video Card(s) MSI SUPRIM Liquid X 4090
Storage 1x 2TB WD SN850X | 2x 8TB GAMMIX S70
Display(s) Odyssey OLED G9 (G95SC)
Case Thermaltake Core P3 Pro Snow
Audio Device(s) Moondrop S8's on schitt Modi+ & Valhalla 2
Power Supply Seasonic Prime TX-1600
Mouse Lamzu Atlantis mini (White)
Keyboard Monsgeek M3 Lavender, Akko Crystal Blues
VR HMD Quest 3
Software Windows 11 Pro Workstation
Benchmark Scores I dont have time for that.
Nvidia on Linux is not ideal as the drivers were closed-source until recently, and the open-source drivers are not very mature.

Meanwhile

1724338112385.jpeg


1724338122999.png


1724338141637.png


1724338155797.png


1724338169898.png


1724338225459.jpeg



If you're going to parade as a Linux enthusiast, and god knows we need more of them, please do the research instead of echo-chambering the last 20 years of GPUs on Linux. It doesn't do anyone any good when you try hard to sell the environment as a whole and then say "but <fanboy argument even in this realm of computing>"; it makes everyone look bad.


As for the on-topic part of this thread: big sad :c I idle at like 150 W.
 
Top