
7800X3D vs 14900K video by HWUB. What would you choose for gaming?

Joined
Nov 13, 2007
Messages
10,383 (1.70/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.4, 4.8Ghz Ring 190W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 prr
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
I think you're overstating how much the CPU impacts the performance of games. You know I still game on my 3930K with stuff running in the background? The vast majority of the time I'm not CPU-limited, with some rare exceptions like Cities: Skylines 2. If my 11-year-old 6c/12t CPU can handle these things fine for the most part, I assure you that a 7800X3D will as well. Also, YouTube is likely GPU-accelerated, so there is that.

Remember that one time W1zz did a test of gaming on just the E-cores? I suggest checking that out again. At 4K, the difference from E-cores to P-cores was a whopping 5% (if you round up). The only time it really matters is if you're trying to pump out as many frames as possible on something like a 144 Hz display. If you're running at 60 Hz, it probably doesn't make much difference.

It kind of depends. A lot of newer games (ahem, Starfield, Jedi Survivor, Hogwarts Legacy) can have horrible shader stutter and poor thread optimization, so throwing faster RAM and, to a lesser extent, a better CPU at them will help make them tolerable at 120 Hz.

Averages tend to hide those game-ruining frame pacing issues. The X3D chips help a lot; they still have shader stutter, but having a fast, low-latency CPU can hide it quite a bit.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,153 (2.88/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
can have horrible shader stutter
You can't blame a CPU for a game that doesn't precompile shaders. Inefficient games are a software problem, not a hardware one. The funny thing is that if I play something like Farm Sim 22 on my MacBook Pro, it has hardcore shader stutter for the first several minutes, but if I load it via Proton in Linux, it does it all ahead of time. So blame the software, not the hardware running it.

This is like saying your car's brakes are stuck, so you need a stronger engine to power through it. Makes no sense.
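To illustrate the difference (a toy sketch in Python, not real graphics API code): compiling on demand pays the cost inside the frame loop the first time a shader is drawn, which is exactly where a hitch shows up, while precompiling at load pays the same cost up front before gameplay starts.

```python
# Toy illustration only: a dict stands in for a driver-side shader cache and
# time.sleep() stands in for an expensive shader compile.
import time

def compile_shader(name: str) -> str:
    time.sleep(0.05)                      # pretend this takes 50 ms
    return f"binary::{name}"

SHADERS = [f"material_{i}" for i in range(20)]
cache: dict[str, str] = {}

def render_frame(shader: str) -> None:
    if shader not in cache:               # lazy path: compile on first use...
        cache[shader] = compile_shader(shader)   # ...so the 50 ms lands mid-frame (stutter)
    # ... issue the draw call here ...

def precompile_all() -> None:
    for shader in SHADERS:                # eager path: pay the whole cost at load time
        cache[shader] = compile_shader(shader)
```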
 
Joined
Jun 6, 2022
Messages
622 (0.79/day)
The video is objective, since the 7800X3D is less pricey, cooler, more efficient, not in need of ultra-fast/expensive RAM, on an upgradeable platform, and faster in games, which is the scope of this video's testing. Very few people predicted the effect 3D V-Cache would have in the CPU world. It has indeed turned the tide in gaming and efficiency, and now it has made its way into servers. Until Intel brings something equally revolutionary to CPU tech, they will be behind.
Objective? Don't you think a comparison between the i9 and the R9 X3D was needed?
According to the TPU review, the 14600K destroys the 7800X3D in applications and only loses 2% in 4K gaming (no one buys a 4090 for 720p, I guess).
According to the same review, the 14600K destroys the 7800X3D in the performance-per-dollar chapter.

You are objective when you weigh all the qualities and defects. The X3D's defect is that it is much too expensive for what it offers, and the trap many fall into is that these processors are only tested with the most powerful video cards.
 
Last edited by a moderator:
Joined
Nov 13, 2007
Messages
10,383 (1.70/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.4, 4.8Ghz Ring 190W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 prr
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
You can't blame a CPU for a game that doesn't precompile shaders. Inefficient games are a software problem, not a hardware one. The funny thing is that if I play something like Farm Sim 22 on my MacBook Pro, it has hardcore shader stutter for the first several minutes, but if I load it via Proton in Linux, it does it all ahead of time. So blame the software, not the hardware running it.

This is like saying your car's brakes are stuck, so you need a stronger engine to power through it. Makes no sense.

Right, but I can't control the software; I can only buy the hardware that alleviates those issues. I'm not going to refactor the game.

Objective? Don't you think a comparison between the i9 and the R9 X3D was needed?
According to the TPU review, the 14600K destroys the 7800X3D in applications and only loses 2% in 4K gaming (no one buys a 4090 for 720p, I guess).
According to the same review, the 14600K destroys the 7800X3D in the performance-per-dollar chapter.

You are objective when you weigh all the qualities and defects. The X3D's defect is that it is much too expensive for what it offers, and the trap many fall into is that these processors are only tested with the most powerful video cards.

No! You are not objective when you present the beauty of a girl but "forget" to say that she hates the kitchen and the vacuum cleaner.

They could have easily done this video with a 7950X and a 7950X3D and come to the same conclusion.
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,091 (2.84/day)
Location
Piteå
System Name White DJ in Detroit
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Line6 UX1 + some headphones, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
OK, if you only play games and take a locked 8-core CPU worth $350 running at a moderate frequency as your baseline, why would you ever compare it to an unlocked 24-core CPU worth $600 running as fast as it can, when you know that games cannot utilise 32 threads?

The latter CPU would always consume more energy and be less efficient than the first one, even if they were from the same manufacturer.

BTW, if you tune a 14900K to run at lower frequencies and limit its power draw, it may become, say, 15% weaker than the 7800X3D in gaming but still, say, 70% stronger in productive tasks, while being efficient and very easy to cool. For some, that 70% more in productive tasks would justify paying 50% more for it, and the tuned 14900K would be a more balanced and universally usable CPU.

Because that's the point of the video. "A head to head between AMD's and Intel's best gaming processors" is the framing. It's not about "best CPU in this range" it's "Best gaming CPU money can buy", and it just so happens that a $3xx part beats a $5xx part.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,035 (1.28/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
and I got a strong vibe of it not being very objective. I even got the feeling that Steve will not be proud of this video after some time passes.
I never doubt the numbers that they present, they seem very reliable.

I do, however, think Steve adds far too much personal flair to the whole thing for my liking; his derogatory comments and general attitude are inflammatory and borderline trollish. I don't think he'll ever regret it though, it's been his style for years, and it's his platform... I tend to skip his lengthy monologues and dribblings at the end and just absorb the objective parts of the reviews, the charts.

I will also add, however, that while I don't doubt the numbers he gets, he gets certain numbers by carefully crafting the testing methodology to show the result he wanted to show all along. For example, some of the 8GB VRAM stuff recently: some beautifully constructed tests to purposely cripple 8GB cards, largely using scenarios no sane person would, just to make his, uhh... point.
 
Joined
Apr 30, 2008
Messages
4,880 (0.82/day)
Location
Multidimensional
System Name Boomer Master Race
Processor AMD Ryzen 7 7800X3D 4.2Ghz - 5Ghz CPU
Motherboard MSI B650I Edge Wifi ITX Motherboard
Cooling CM 280mm AIO + 2x 120mm Slim fans
Memory Kingston Fury 32GB 6000Mhz
Video Card(s) ASUS RTX 4070 Super 12GB OC
Storage Samsung 980 Pro 2TB + WD 2TB 2.5in HDD
Display(s) Sony 4K Bravia X85J 43Inch TV 120Hz
Case CM NR200P Max TG ITX Case
Audio Device(s) Built In Realtek Digital Audio HD
Power Supply CoolerMaster V850 SFX Gold 850W PSU
Mouse Logitech G203 Lightsync
Keyboard Atrix RGB Slim Keyboard
VR HMD ( ◔ ʖ̯ ◔ )
Software Windows 10 Home 64bit
Benchmark Scores Don't do them anymore.
OP & Gica, you're both coping hard. Are you sure you're not from UserBenchmark? Stop overthinking something so simple.
 
Joined
Jul 13, 2016
Messages
3,045 (1.03/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
BTW, if you tune a 14900K to run at lower frequencies and limit its power draw, it may become, say, 15% weaker than the 7800X3D in gaming but still, say, 70% stronger in productive tasks, while being efficient and very easy to cool. For some, that 70% more in productive tasks would justify paying 50% more for it, and the tuned 14900K would be a more balanced and universally usable CPU.

There's no sense in power limiting a 14900K when the 7950X3D is going to beat a power-limited 14900K in every aspect, including power consumption. Both CPUs are the same price and have about the same performance out of the box. The major difference between the two is that the 7950X3D is vastly more efficient. Power limiting the 14900K will bring it closer to the 7950X3D, but not enough to make anyone care, and in the process it will make it vastly slower in multi-threaded benchmarks and slower in games. At the end of the day, a power-limited 14900K is a really lame version of a stock 7950X3D, and that's before considering that you can set the 7950X3D to prefer cache and set a negative PBO offset to increase gaming performance and efficiency even further. Intel really needs to offer the 14900K at a lower cost than the 7950X3D for it to make sense. It doesn't take the gaming crown from the 7800X3D, it's not efficient, and it doesn't beat the best all-rounder in the 7950X3D. It's also on a dead platform to boot. Really, it needs to be at least $100 cheaper than its current price; currently, the 7950X3D is cheaper.

It depends - the 14900K is an all-rounder... If you have Discord and YouTube in the background (how I like to play games), I imagine the 14900K wins; if you're looking for the best power/perf while gaming, then get the 7800X3D.

But overall -- it really doesn't matter. At all. You won't be able to tell a difference between these two.


The 7800X3D would almost certainly still win in that scenario. I have a 7800X3D and routinely have multiple things open in the background without an issue. You are assuming that the 7800X3D is 100% pegged by a game, but I have yet to see a single game that does that. This isn't like back when Intel only offered 4 cores that were 100% utilized by the games of the time, before Zen came in and introduced cheap 8-core CPUs; 8 cores is enough to play games and have background apps running without stuttering.

Right -- you can do that with an i3. My point was that the margin between the top CPUs is so slim that background apps could change it... basically, it doesn't matter. AM5 is a much better buy now, though; you would really need to want Intel to get the 14900K.

The margin between the 14900K and 7800X3D is in general small enough not to matter unless you seek the absolute best gaming performance. That said, your proposed scenario could very well tilt performance in favor of the 7800X3D, depending on which tasks are assigned to which cores in the Intel processor. It could very well be that something latency-sensitive is kicked to an E-core on the 14900K and that causes performance to degrade. That's the advantage of the 7800X3D: there are no slow cores, just fat cores with a fat cache to ensure consistently low latency. It's impossible to say for sure, though; you'd have to make a benchmark that could test that objectively, and no one has ever done that. It's one thing to claim that a processor pegged at 100% utilization will see a performance dip when other apps are opened, but when it's running at 60-70% on average? It's anyone's guess.
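If someone did want to poke at that objectively, a crude starting point would be pinning the same workload to different core sets and comparing run times and variance. A minimal sketch with Python's psutil follows; the logical-CPU index lists are hypothetical placeholders and depend entirely on your chip and OS topology, so they would need adjusting.

```python
# Minimal sketch, assuming psutil is installed; cpu_affinity works on
# Windows/Linux, and the P-core/E-core index lists below are hypothetical
# placeholders -- check your own CPU topology before trusting the labels.
import time
import psutil

def timed_spin(iterations: int = 5_000_000) -> float:
    """Burn some CPU and return the elapsed wall time in seconds."""
    start = time.perf_counter()
    total = 0
    for i in range(iterations):
        total += i * i
    return time.perf_counter() - start

def run_pinned(label: str, cpus: list[int]) -> None:
    psutil.Process().cpu_affinity(cpus)   # pin this process to the given logical CPUs
    print(f"{label:>8}: cpus={cpus} elapsed={timed_spin():.3f}s")

if __name__ == "__main__":
    p_cores = [0, 1, 2, 3]        # placeholder indices
    e_cores = [16, 17, 18, 19]    # placeholder indices
    run_pinned("P-cores", p_cores)
    run_pinned("E-cores", e_cores)
```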

I never doubt the numbers that they present, they seem very reliable.

I do, however, think Steve adds far too much personal flair to the whole thing for my liking; his derogatory comments and general attitude are inflammatory and borderline trollish. I don't think he'll ever regret it though, it's been his style for years, and it's his platform... I tend to skip his lengthy monologues and dribblings at the end and just absorb the objective parts of the reviews, the charts.

I will also add, however, that while I don't doubt the numbers he gets, he gets certain numbers by carefully crafting the testing methodology to show the result he wanted to show all along. For example, some of the 8GB VRAM stuff recently: some beautifully constructed tests to purposely cripple 8GB cards, largely using scenarios no sane person would, just to make his, uhh... point.

Certain parts of constructing a testing methodology will always be criticized, as there is leeway in deciding how things are tested and some of that is subjective. That's why it's important to read multiple reviews from known professionals. Even if you don't agree with their results, it's important to have another valid data set. For example, I absolutely hated Tom's Hardware's last GPU testing suite game selection due to the number of performance outliers in it compared to other testing suites. In general, I would prefer to limit outliers as much as possible unless the data set is very large.

Averages tend to hide those game-ruining frame pacing issues. The X3D chips help a lot; they still have shader stutter, but having a fast, low-latency CPU can hide it quite a bit.

True; this is why I like to see frame-time charts, which give you an instant idea of the smoothness of a title.
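For anyone who wants to see the point numerically rather than in a chart, here is a small illustrative Python sketch (synthetic numbers, not measured data): two traces with nearly identical average FPS but wildly different 1% lows.

```python
# A minimal sketch of why averages hide stutter: two synthetic frame-time
# traces with almost the same average FPS but very different 1% lows.
import statistics

def summarize(frametimes_ms):
    """Return (average FPS, 1% low FPS) for a list of frame times in ms."""
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)          # slowest frames first
    slowest_1pct = worst[: max(1, len(worst) // 100)]    # worst 1% of frames
    low_fps = 1000.0 / statistics.mean(slowest_1pct)
    return avg_fps, low_fps

smooth   = [10.0] * 1000                 # steady 10 ms frames (~100 FPS)
stuttery = [9.5] * 990 + [60.0] * 10     # near-identical average, periodic 60 ms spikes

for name, trace in (("smooth", smooth), ("stuttery", stuttery)):
    avg, low = summarize(trace)
    print(f"{name:>9}: avg {avg:5.1f} FPS, 1% low {low:5.1f} FPS")
```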
 
Last edited:
Joined
Nov 13, 2007
Messages
10,383 (1.70/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.4, 4.8Ghz Ring 190W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 prr
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Steve isn't going to make a benchmarking video of himself with his music and Twitch stream open and post the FPS he's getting with the 14900K vs the 7800X3D. Hell, he doesn't even use MSI boards on AM5, because he knows Aorus/Gigabyte boards have the best AGESA timings and give him 10% more FPS with the CL30 6000 kits.

I want to see benches with stuff open in the background:


They used to do it with Alder Lake (Optimum PC did one, and a few others as well), and (not surprisingly) the CPU with more cores won, even if the cores weren't pegged.

But his point stands either way... the 14900K is kind of a shite product in $/performance if you're only gaming. If you're a streamer though... eh, maybe? I doubt we will see that review though.

They did one a while ago: (300 tabs open)
 
Joined
Sep 10, 2018
Messages
6,140 (2.85/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
At the end of the day, who cares? Both CPUs have a place, and a consumer with even slightly below-average intelligence should be able to figure out which one is right for them. There are also a half dozen, maybe more, decent enough options from both companies...
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,035 (1.28/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Certain parts of constructing a testing methodology will always be criticized, as there is leeway in deciding how things are tested and some of that is subjective. That's why it's important to read multiple reviews from known professionals. Even if you don't agree with their results, it's important to have another valid data set. For example, I absolutely hated Tom's Hardware's last GPU testing suite game selection due to the number of performance outliers in it compared to other testing suites. In general, I would prefer to limit outliers as much as possible unless the data set is very large.
Absolutely, which is why they're only one of the places I get my data from.

Specifically, I really like TPU for the consistent testing and the very open communication around changing test conditions and methodology, be it hardware/software or anything else.

HUB, by comparison, don't appear to follow a standard test suite; each video they make is a new methodology and test suite of games depending on what they're trying to show/achieve. And again, I don't doubt that the numbers presented are what they got during testing, but it leaves a heck of a lot more room to construct the test with a desired outcome already in mind, which is sometimes obvious, at least to me, especially when I've stuck around for a few choice cringe monologues at the end. It's also pretty obvious that he loves riling up fanboys, all of them, no matter what they support; he'll gladly fan the flames, and he literally admitted it in a YT comment to me.
 
Joined
Jul 13, 2016
Messages
3,045 (1.03/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Steve isn't going to make a benchmarking video of himself with his music and Twitch stream open and post the FPS he's getting with the 14900K vs the 7800X3D. Hell, he doesn't even use MSI boards on AM5, because he knows Aorus/Gigabyte boards have the best AGESA timings and give him 10% more FPS with the CL30 6000 kits.

I want to see benches with stuff open in the background:

They used to do it with Alder Lake (Optimum PC did one, and a few others as well), and (not surprisingly) the CPU with more cores won, even if the cores weren't pegged.

But his point stands either way... the 14900K is kind of a shite product in $/performance if you're only gaming. If you're a streamer though... eh, maybe? I doubt we will see that review though.

They did one a while ago: (300 tabs open)

The vast majority of Twitch streamers use the video encoder built into the GPU, which has minimal CPU overhead. For those that go a step above that, the general recommendation is to get a separate system that handles the encoding. It's never recommended to do CPU encoding on the system you are also streaming from, because that introduces the possibility of inconsistency, and you typically have to make sacrifices in performance in exchange for higher core-count CPUs. If you are upgrading from GPU-accelerated encoding, it typically means you want high quality, and many people opt for Threadripper so they can crank up the encoding preset to increase quality while maintaining bitrates supported by major streaming services. This is probably why it doesn't make a ton of sense to have a test demonstrating the performance of CPU-based encoding on the system you are playing the game on; it's not a scenario anyone should find themselves in. That user should just enable GPU-accelerated encoding if they don't have a separate system. The stream will be smoother as a result, and modern GPUs can output good-quality video (even more so once services start supporting AV1).
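For what it's worth, the trade-off is easy to see with ffmpeg. A rough sketch below (assuming an ffmpeg build with NVENC support is on PATH, and "input.mkv" is a placeholder capture file) runs the same clip through the GPU encoder and through x264 on the CPU.

```python
# Rough sketch, not anyone's review methodology: compare GPU (NVENC) vs CPU
# (x264) encoding of the same clip. Assumes ffmpeg with NVENC is on PATH and
# "input.mkv" is a placeholder game capture.
import subprocess

INPUT = "input.mkv"  # hypothetical capture file

# GPU encode: the work happens on the GPU's dedicated encoder block,
# so CPU overhead while gaming stays minimal.
nvenc_cmd = [
    "ffmpeg", "-y", "-i", INPUT,
    "-c:v", "h264_nvenc", "-b:v", "6M",
    "-c:a", "copy", "out_nvenc.mp4",
]

# CPU encode: better quality per bitrate at slower presets, but it competes
# with the game for CPU time, which is why a second PC is usually recommended.
x264_cmd = [
    "ffmpeg", "-y", "-i", INPUT,
    "-c:v", "libx264", "-preset", "medium", "-b:v", "6M",
    "-c:a", "copy", "out_x264.mp4",
]

for cmd in (nvenc_cmd, x264_cmd):
    subprocess.run(cmd, check=True)
```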
 
Joined
Dec 25, 2020
Messages
5,550 (4.21/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Both of these CPUs have outstanding performance. Nothing changed whatsoever from the old video where they pitted it against the 13900K... after all, the 14900K is the exact same processor. In terms of raw grunt, the i9 is obviously the superior processor. This was true when the 7800X3D (and even the 7950X3D) was pitted against the 13900K, it remained true when the i9-13900KS was released for the few crazy folks who bought one, and it remains true now with the i9-14900K (which aims to bring that improvement to everyone else)... after all, there are absolutely zero, zilch, nada changes between these three CPUs other than their binning and factory clocks (13900KS > 14900K > 13900K for binning, 14900K > 13900KS > 13900K for default clocks). It only paints a story we always knew: there's no absolute best regardless of workload anymore. And that's fine; it's the only reason you pay $400-800 for a flagship CPU instead of $1000-1800.

Productivity applications tend to reflect the i9's extra grunt, while games, being weird and memory-bound most of the time, react positively to the X3D. This is fine, and this is not news. I believe the Ryzen, being cheaper, is a no-brainer for gamers who aren't looking for an extra bit of kick and just want a no-nonsense, fast gaming processor: buy the X3D... otherwise get the Intel chip and don't look back. Socket AM5's gravest early-release issues with AGESA seem to be mostly ironed out, so with a BIOS update, I think most people will be satisfied with the Ryzen's stability by now.

That's my stance as a 13900KS owner anyway.

Because that's the point of the video. "A head to head between AMD's and Intel's best gaming processors" is the framing. It's not about "best CPU in this range" it's "Best gaming CPU money can buy", and it just so happens that a $3xx part beats a $5xx part.

I mean, there's always the i7-14700K. It's another very viable proposition priced similarly to the 7800X3D in general. Not a half-bad CPU to have for the price.
 
Joined
Nov 11, 2016
Messages
3,288 (1.16/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
Well, Intel released a crappy gen; even Intel knows that themselves, LOL.

Too bad I ran out of relatives to give my old PC to, otherwise I would love to get a 7800X3D
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,035 (1.28/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Too bad I ran out of relatives to give my old PC to, otherwise I would love to get a 7800X3D
You can always adopt a 36-year-old Australian son :D
 
Joined
Jan 14, 2019
Messages
10,831 (5.34/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Intel has a brand and marketing problem, a problem that they created themselves.

They built a reputation for "best gaming performance at all costs," and now that they've lost the one metric they could always boast about, their entire lineup looks embarrassingly overpriced. It doesn't help that AMD is beating them with a CPU that has 66% fewer cores and uses less than half the power in gaming. People always used to say "I only care about gaming" back when Intel had the advantage, even though that's never really true, and now their only option is to either beat AMD by a good margin or undo all the marketing they used to build their image upon.
I'm still saying that I only care about gaming. That's why I had an i7-7700, then an i7-11700, and now a 7800X3D. Intel's 12th-14th gens are just as much trash for me as anything AMD made between circa 2008 and 2018.

So any test is free to include productivity benchmarks; I'm just not gonna read those pages, as I don't care if the 14900K runs rings around my 7800X3D in a program I'll never use while consuming 300 watts.

Am I biased? Hell yeah! I'm biased towards gaming. Whoever gives the best gaming performance with the lowest power consumed at the lowest price wins. I don't care whether the box of the product is red or blue. It's that simple. :)
 
Last edited:
Joined
Nov 29, 2022
Messages
742 (1.21/day)
Processor Intel i7 77OOK
Motherboard Gigabyte Aorus something
Cooling Noctua NH-U12S dual fan
Memory Ballistix 32 Go
Video Card(s) MSI 3060 Gaming X
Storage Mixed bag of M2 SSD and SATA SSD
Display(s) MSI 34" 3440x1440 Artimys 343CQR
Case Old Corsair Obsidian something
Audio Device(s) Integrated
Power Supply Old Antec HCG 620 still running good
Mouse Steelseries something
Keyboard Steelseries someting too
Benchmark Scores bench ? no time to lose with bench ! :)
Maybe the video should have been labelled: "Gaming? Choose a 7800X3D (€450) instead of a 14900K (€800) and invest the €350 left over in a better graphics card"
(prices checked this morning in France, in an expensive e-shop)

The price difference between the two is a big factor here.
If they were at the same price... choose your preferred color, red or blue.

But €350 more is a big bump in the graphics card budget, and as a gamer the choice is obvious, no?

 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,035 (1.28/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
about the only gaming pro I saw mentioned online for the 14900K (this video riled up a few, and Steve took to twitter/X to ask for pro's), is that because of the productivity chops, it stands to reason that games that have lengthy shader pre-compilation waits, it should process them faster.

I'd take all the pro's of the 7800X3D against that any day, but it would appear to be a valid pro.
 
Joined
May 31, 2016
Messages
4,410 (1.48/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Maybe the video should have been labelled: "Gaming? Choose a 7800X3D (€450) instead of a 14900K (€800) and invest the €350 left over in a better graphics card"
(prices checked this morning in France, in an expensive e-shop)

The price difference between the two is a big factor here.
If they were at the same price... choose your preferred color, red or blue.

But €350 more is a big bump in the graphics card budget, and as a gamer the choice is obvious, no?

Simple, isn't it? If the performance is very close for all those things you do, check the price and decide :)

As for the "is HWUB being objective" question: I think they are. They are testing the gaming capabilities of the CPUs. People mentioned 4K and mid-range cards. Well, if you want to play at 4K, you obviously don't need that much CPU power, so that premise is a bit out of place here, since HWUB is trying to find out which processor is the best for gaming, not which one can run 4K on a mid-range card, since that would be the vast majority of CPUs currently on the market.
 

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,706 (2.38/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
The opening lines of this video (which give the exact context of the comparison) are these:

Welcome back to Hardware Unboxed. Today we're taking an up down(?) look at the gaming performance of the Core i9-14900K and Ryzen 7800X3D.

Given that the OP is contextually misleading, I'm closing down this thread. The mods get grief for these actions, but the point is, a thread was started with no intention other than to cause - not discussion - but argument. The OP mentions how unfair it is, how unobjective it is, and goes on to talk about productivity performance. I'll ask you to reread what I quoted. The segment (as has been mentioned by other members) was filmed with an eye to the best gaming processor between the two brands' bestselling chips (Steve's words).

The video is not choosing to compare overall performance. That is clear from the video. What we have here is a thread created (at best) out of ignorance of the context, or (at worst) to stir things up. Either way, threads need to start on a good footing to be relevant and worth discussion. Threads started with initial bias using out-of-context subject matter are poor. There'd be nothing wrong with starting a thread weighing up both CPUs on their relative merits, but in this case, with the entire 'rant' attitude of the OP, it's flamebait.

Closed.


EDIT:

We decided to reopen the thread because the discussion has been quite civil so far, despite the misleading thread title. The thread has been reopened, but the title was adjusted.

If discussion goes out of hand we will close the thread.
 
Last edited by a moderator:
Joined
May 24, 2023
Messages
809 (1.85/day)
Because that's the point of the video. "A head to head between AMD's and Intel's best gaming processors" is the framing. It's not about "best CPU in this range" it's "Best gaming CPU money can buy", and it just so happens that a $3xx part beats a $5xx part.
Both CPUs can be used for gaming, but in terms of comparability, these chips are completely different (one has triple the core count of the other, plus differences in price, locked/unlocked status, and frequencies).
...
I do, however, think Steve adds far too much personal flair to the whole thing for my liking; his derogatory comments and general attitude are inflammatory and borderline trollish. I don't think he'll ever regret it though, it's been his style for years, and it's his platform...
That is why I wrote about the vibe of the video; not everything is in words and numbers. Steve had a troll gleam in his eyes... :)
There's no sense in power limiting a 14900K when the 7950X3D is going to beat a power-limited 14900K in every aspect, including power consumption. Both CPUs are the same price and have about the same performance out of the box.
Limiting power / frequencies of both 14900K and 7950X makes perfect sense to prevent overheating and to increase efficiency.

I have no problem accepting that 14900K will have a bit (if I remember correctly it is around a third) lower power efficiency caused by the old process it is made on, and that it is slower for gaming (I have a mid tier graphic card and I am limited by it, so it is not slower for me).

I view a 14900K as a building set you can use to build a CPU you want. I do not like how it is set out of the box.

I also do not like how AMD CPUs have thick small heatspreaders, are hard to cool and are set to hit temperature limits.

I also do not like that you need to buy the contact frame for Intel CPUs, because the stock socket mounting mechanism really bends them. On the other hand, they have a normal, thin heatspreader and are easy to cool even at pretty high power draw; you really do not need to mod them (delid, etc.) in any way.

I do not like idle power draw of AMD CPUs.

I have no doubt that the second gen of AM5 chipsets/motherboards will be much more mature than the first gen. I think that last-gen LGA1700 is easier to get going and more reliable than the first-gen AM5 boards. I think that X670E and the 7800X3D are just a prelude to what AM5 can do this year. The 7800X3D is not the best thing ever; it is the worst kind of gaming CPU for the AM5 socket. (I am cunningly waiting for better things to come while surviving on an adjusted 14900K.)

You can think about such stuff as above quite rationally and objectively without any gleam in your eyes.
 
Joined
Jan 14, 2019
Messages
10,831 (5.34/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
The 7800X3D is not the best thing ever; it is the worst kind of gaming CPU for the AM5 socket.
I agreed with your post up to this point, but this is just... no. Absolutely no. I'm curious what you base this opinion on, though. Have you ever tried one?
 
Joined
Aug 12, 2019
Messages
139 (0.08/day)
Location
Poland
Processor R5 5600
Motherboard MSI B450M Mortar Max
Cooling SPC Fortis 3
Memory Crucial Ballistix Sport LT 3000 2x8
Video Card(s) MSI Gaming Trio RTX 3070
Storage Lexar Pro NM760 1 TB, Corsair MP510 960 GB
Display(s) Dell U2412M
Case be quiet! Pure Base 500
Power Supply EVGA G2 750W
Mouse Logitech G500
Keyboard SPC Gear GK550 Omnis Kailh Brown RGB
I will also add, however, that while I don't doubt the numbers he gets, he gets certain numbers by carefully crafting the testing methodology to show the result he wanted to show all along. For example, some of the 8GB VRAM stuff recently: some beautifully constructed tests to purposely cripple 8GB cards, largely using scenarios no sane person would, just to make his, uhh... point.
What kind of crafted scenarios? Can you point to any examples?
 
Joined
Jun 2, 2017
Messages
8,403 (3.21/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Both CPUs can be used for gaming, but in terms of comparability, these chips are completely different (one has triple the core count of the other, plus differences in price, locked/unlocked status, and frequencies).

That is why I wrote about the vibe of the video; not everything is in words and numbers. Steve had a troll gleam in his eyes... :)

Limiting power / frequencies of both 14900K and 7950X makes perfect sense to prevent overheating and to increase efficiency.

I have no problem accepting that 14900K will have a bit (if I remember correctly it is around a third) lower power efficiency caused by the old process it is made on, and that it is slower for gaming (I have a mid tier graphic card and I am limited by it, so it is not slower for me).

I view a 14900K as a building set you can use to build a CPU you want. I do not like how it is set out of the box.

I also do not like how AMD CPUs have thick small heatspreaders, are hard to cool and are set to hit temperature limits.

I also do not like that you need to buy the contact frame for Intel CPUs, because the stock socket mounting mechanism really bends them. On the other hand, they have a normal, thin heatspreader and are easy to cool even at pretty high power draw; you really do not need to mod them (delid, etc.) in any way.

I do not like idle power draw of AMD CPUs.

I have no doubt that the second gen of AM5 chipsets/motherboards will be much more mature than the first gen. I think that last-gen LGA1700 is easier to get going and more reliable than the first-gen AM5 boards. I think that X670E and the 7800X3D are just a prelude to what AM5 can do this year. The 7800X3D is not the best thing ever; it is the worst kind of gaming CPU for the AM5 socket. (I am cunningly waiting for better things to come while surviving on an adjusted 14900K.)

You can think about such stuff as above quite rationally and objectively without any gleam in your eyes.
"I have no problem accepting that 14900K will have a bit (if I remember correctly it is around a third) lower power efficiency caused by the old process it is made on, and that it is slower for gaming.

"(I have a mid tier graphic card and I am limited by it, so it is not slower for me)." So a 14900K instead of a mid range cpu for a mid range GPU).

You are probably of the opinion that V-Cache only makes a difference when using something like a 4090.


"I do not like idle power draw of AMD CPUs."


You said this in the same thought about accepting Intel power draw, even though it is higher.

I do not watch Hardware Unboxed anymore. I have two of the parts they have maligned the most in my PC, the 7900XT and the 7900X3D. I have heard him call both of those worthless.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,153 (2.88/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Right, but I can't control the software; I can only buy the hardware that alleviates those issues. I'm not going to refactor the game.
Actually, you can. It's called voting with your wallet. Don't buy software that's half-baked. It's really that simple. Don't buy something shiny just because it's shiny.

I also do not like how AMD CPUs have thick small heatspreaders, are hard to cool and are set to hit temperature limits.
Bruh. You do realize that the 14900K basically jumps onto the thermal limiter unless you get at least something like a 240mm AIO? You really can't talk about temperature and expect the 14900K to be winning any brownie points.
 
Last edited: