
AMD Ryzen 7 9700X

Joined
Nov 11, 2016
Messages
3,399 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.Skill 6400MT/s CL32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razer DeathAdder V3
Keyboard Razer Huntsman V3 Pro TKL
Software win11
I had to go check some of the individual benchmarks and cross-compare a little, and yeah, that is exactly what is going on. In Cyberpunk 2077, for example, without RT both Tom's and TPU show the 9600X performing very well - beating a 14700K by 8-13% depending on which site does the test.

So they're not really in conflict; it's just that none of them has a particularly large sample of games, and Zen 5 performance seems to vary pretty wildly from one game to the next.

HUB read our minds and posted a new video talking about cherry-picking games ;)

I have always liked those 50-game benchmarks that HUB and TPU did; no room for cherry-picking there.
 
Joined
Jan 27, 2015
Messages
1,715 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
Not true.

Ancient Replays and Level1Techs did test with a 7900 XTX alongside the 4090 and obtained different and interesting results.

Same for doing benchmarks using Linux instead of Windows.

The point is, the magical 4090 is not the end-all, be-all.

ComputerBase tests with a 7900 XTX and a 4090, and the results were not different in any significant way.

As far as testing with a more midrange card, well there are sites that do that too.

HotHardware uses a 4070 Super. Guess what their tests look like?

Obviously in this environment, winners and losers are more a result of chance and margin of error than actual performance advantages.

[attached chart: HotHardware benchmark results]


So I will say I think there is a place for this type of hardware in a review; however, it is with budget CPUs and budget memory/motherboards.

The upper midrange CPUs are still too fast to differentiate with anything but high-end GPUs.

That might change with the next gen of GPUs, as it appears they are advancing much faster than this gen of CPUs will.
 
Joined
Feb 28, 2024
Messages
66 (0.25/day)
ComputerBase tests with a 7900 XTX and a 4090, and the results were not different in any significant way.

As far as testing with a more midrange card, well there are sites that do that too.

HotHardware uses a 4070 Super. Guess what their tests look like?

Obviously in this environment, winners and losers are more a result of chance and margin of error than actual performance advantages.

So I will say I think there is a place for this type of hardware in a review; however, it is with budget CPUs and budget memory/motherboards.

The upper midrange CPUs are still too fast to differentiate with anything but high-end GPUs.

That might change with the next gen of GPUs, as it appears they are advancing much faster than this gen of CPUs will.
Thanks for the review tips, I will have to read them when I'm not at work and can spend more than a few minutes.

I wish they'd test at 1440p at the least, though. I can only speak for myself, but I did not spend a grand on a video card to play at 1080p, so 1080p results don't tell me anything other than theoretical differences in a lab environment; they don't tell me what the effect would be on anything I'd actually be playing. I like that TPU even does 4K. Although, I think your statement on upper midrange CPUs is good advice.
 
Joined
Apr 30, 2020
Messages
985 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X3D
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Capellix
Memory Corsair Vengeance Pro RGB 3200MHz 32GB
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western Digital SATA 6.0 SSD 500GB + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Corsair K55 Pro RGB
ComputerBase tests with a 7900 XTX and a 4090, and the results were not different in any significant way.

As far as testing with a more midrange card, well there are sites that do that too.

HotHardware uses a 4070 Super. Guess what their tests look like?

Obviously in this environment, winners and losers are more a result of chance and margin of error than actual performance advantages.


So I will say I think there is a place for this type of hardware in a review; however, it is with budget CPUs and budget memory/motherboards.

The upper midrange CPUs are still too fast to differentiate with anything but high-end GPUs.

That might change with the next gen of GPUs, as it appears they are advancing much faster than this gen of CPUs will.
Oddly enough, the notes on the marketing slide point to the use of not one but two 7900 XTXs in the system, which is odd in itself, as I believe only one or two games on AMD's list can even use multi-GPU features like mGPU.
 
Last edited:
Joined
Feb 28, 2024
Messages
66 (0.25/day)
Checked out HU's clarification video and the ComputerBase and HotHardware tests.

ComputerBase tested at 1280x720, where the 7900 XTX beat the 4090 like a red-headed stepchild, which shouldn't happen - so all that tells me is the 4090 is not well optimized for ultra-high framerates at a resolution no one ever plays at IRL.

HotHardware's test with the 4070S showed pretty much what I thought, and why I suggested at least testing a few games with a lower GPU - their test showed that if you have a midrange GPU, you really shouldn't be worried about your CPU and should save your money for a GPU if it's not fast enough for you.

HU gave props to TPU for a good selection of titles.

Still nothing tells me how CPU performance affects my 7900 XTX at resolutions and settings people who spend a grand on a GPU actually play games at.
 
Joined
Jan 27, 2015
Messages
1,715 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
Checked out HU's clarification video and the ComputerBase and HotHardware tests.

ComputerBase tested at 1280x720, where the 7900 XTX beat the 4090 like a red-headed stepchild, which shouldn't happen - so all that tells me is the 4090 is not well optimized for ultra-high framerates at a resolution no one ever plays at IRL.

HotHardware's test with the 4070S showed pretty much what I thought, and why I suggested at least testing a few games with a lower GPU - their test showed that if you have a midrange GPU, you really shouldn't be worried about your CPU and should save your money for a GPU if it's not fast enough for you.

HU gave props to TPU for a good selection of titles.

Still nothing tells me how CPU performance affects my 7900 XTX at resolutions and settings people who spend a grand on a GPU actually play games at.

It is a bit annoying that almost everyone uses a 4090 for testing. You'd think some of these smaller sites would try to grab a niche, do something a little different for their testing.

Techspot did tests that are a bit old now; I wish they would update them and make them not just about AMD. For the time, some important things could be gleaned.




But then there's this:


[attached chart]
 
Joined
Sep 3, 2019
Messages
3,503 (1.84/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 220W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (375W current) PowerLimit, 1060mV, Adrenalin v24.10.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2161), upgraded from Win10 to Win11 on Jan 2024
While I'm at it, telling you how to do your job (I'm kidding, and I'm sure you welcome suggestions), it would be AWESOME if someone benched the games with something lower than a 4090 so all of us who don't have a $2000 GPU can see how much CPU choice actually affects us. Something like a 4070 Ti/7900 XT that's closer to what most people have, that might still show a difference... Or even something in the $500 range. Just a thought - it drives me nuts that no one does this. The 4090 tests are great for showing the absolute theoretical differences, but I suspect that doesn't affect most gamers in the same way as the charts indicate.

Sorry if I sound critical; TechPowerUp's reviews are awesome and are truly the first I look for when a new component comes out.
You are right, 99% of users do not have a 4090.
But think of it like this...

A review IMHO should always have at least 3 resolutions... 1080p, 1440p, 2160p, and I will explain why this has value beyond the preferred or available res each of us has.
You will notice that as the resolution increases and more load is dumped on the GPU, the differences in FPS between almost all the CPUs shrink.
So by increasing the resolution you are virtually making the GPU smaller and smaller.

So can you imagine what will happen if you use a genuinely smaller GPU instead of the 4090, let's say at 1080p?
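To make that concrete, here is a toy sketch of the idea (every FPS number is invented for illustration, not taken from any review): treat delivered FPS as the lower of the CPU's ceiling and the GPU's ceiling at a given resolution.

```python
# Toy bottleneck model: delivered FPS = min(CPU ceiling, GPU ceiling).
# All numbers are made up purely to illustrate the point.

cpu_ceiling = {"faster CPU": 240, "slower CPU": 180}   # FPS each CPU can feed

# Hypothetical ceilings for one GPU as resolution rises
gpu_ceiling = {"1080p": 220, "1440p": 150, "2160p": 80}

for res, gpu_fps in gpu_ceiling.items():
    fast = min(cpu_ceiling["faster CPU"], gpu_fps)
    slow = min(cpu_ceiling["slower CPU"], gpu_fps)
    print(f"{res}: {fast} vs {slow} FPS -> gap {fast / slow:.2f}x")
# 1080p shows a 1.22x gap; at 1440p and 2160p both CPUs deliver identical FPS.
```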
 
Joined
Dec 6, 2022
Messages
380 (0.53/day)
Location
NYC
System Name GameStation
Processor AMD R5 5600X
Motherboard Gigabyte B550
Cooling Arctic Freezer II 120
Memory 16 GB
Video Card(s) Sapphire Pulse 7900 XTX
Storage 2 TB SSD
Case Cooler Master Elite 120
ComputerBase tests with a 7900 XTX and a 4090, and the results were not different in any significant way.

As far as testing with a more midrange card, well there are sites that do that too.

HotHardware uses a 4070 Super. Guess what their tests look like?

Obviously in this environment, winners and losers are more a result of chance and margin of error than actual performance advantages.


So I will say I think there is a place for this type of hardware in a review; however, it is with budget CPUs and budget memory/motherboards.

The upper midrange CPUs are still too fast to differentiate with anything but high-end GPUs.

That might change with the next gen of GPUs, as it appears they are advancing much faster than this gen of CPUs will.
So the two reviewers I mentioned are wrong because the ones you mentioned obtained different results that satisfy your narrative?

ok…
Still nothing tells me how CPU performance affects my 7900 XTX at resolutions and settings people who spend a grand on a GPU actually play games at.
I hate that you will rarely see those results in the initial CPU review, but as many have explained, the reason is to isolate and concentrate on the CPU, since at high resolutions we are bottlenecked by the GPU.
It is a bit annoying that almost everyone uses a 4090 for testing.
Today's bribed influencers, formerly known as tech reviewers (with some exceptions), need to comply with ngreedia's demands of only showing their free 4090s, or they will not get more free halo GPUs.
 
Last edited:
Joined
Feb 28, 2024
Messages
66 (0.25/day)
I hate that you will rarely see those results in the initial CPU review, but as many have explained, the reason is to isolate and concentrate on the CPU, since at high resolutions we are bottlenecked by the GPU.
As I said previously, I understand why they do this and it is proper scientific method for comparing CPUs, but unfortunately those results don't tell most people if buying a new CPU (or CPU/MB/RAM combo) will help their system run games faster or if it would just be a waste of a lot of money.

Today's bribed influencers, formerly known as tech reviewers (with some exceptions), need to comply with ngreedia's demands of only showing their free 4090s, or they will not get more free halo GPUs.
I don't think TPU is bribed by Nvidia. They are simply using the fastest GPU to isolate CPU differences. Whether or not you agree with that methodology is another topic, but I don't think there's any nefarious reasoning behind it.

You are right, 99% of users do not have a 4090.
But think of it like this...

A review IMHO should always have at least 3 resolutions... 1080p, 1440p, 2160p, and I will explain why this has value beyond the preferred or available res each of us has.
You will notice that as the resolution increases and more load is dumped on the GPU, the differences in FPS between almost all the CPUs shrink.
So by increasing the resolution you are virtually making the GPU smaller and smaller.
Sure, but people don't just read reviews out of scientific curiosity about which CPU is fastest in an ideal environment - they read them to decide if spending the money on an upgrade would help their computer run stuff faster, "stuff" being whatever apps they care about whether that's games or MS Outlook or video encoding, etc. While the results are certainly valid for a pure CPU comparison, they don't provide context relevant to the average reader.

So what I can gather from the results is if I have a 4090, I can up my gaming FPS by going to 7800x3d (for example), and if I have a 5600x (also for example), I'm probably wasting my money on a 4090 unless I want to drop a 5800x3d in or get a whole new computer with a current gen CPU. However, if I had, say, a 4060, I would know nothing about whether a CPU upgrade would speed up my games based on what's on the review.

So can you imagine what will happen if you use a genuinely smaller GPU instead of the 4090, let's say at 1080p?
Oddly, it looks like the 7900 XTX spanks it at super low resolutions like 1280x720 (look at the ComputerBase review). This is probably driver-related IMO, as I doubt Nvidia spends a lot of time optimizing performance for resolutions that low.

I'm not sure why they did this test - most games on Zen 3 "X" CPUs run within an FPS or three of each other between CPUs. And Cyberpunk is particularly bad among games for showing CPU differences.
 
Last edited:
Joined
Mar 31, 2012
Messages
860 (0.19/day)
Location
NL
System Name SIGSEGV
Processor INTEL i7-7700K | AMD Ryzen 2700X | AMD Ryzen 9 9950X
Motherboard QUANTA | ASUS Crosshair VII Hero | MSI MEG ACE X670E
Cooling Air cooling 4 heatpipes | Corsair H115i | Noctua NF-A14 IndustrialPPC Fan 3000RPM | Arctic P14 MAX
Memory Micron 16 Gb DDR4 2400 | GSkill Ripjaws 32Gb DDR4 3400(OC) CL14@1.38v | Fury Beast 64 Gb CL30
Video Card(s) Nvidia 1060 6GB | Gigabyte 1080Ti Aorus | TUF 4090 OC
Storage 1TB 7200/256 SSD PCIE | ~ TB | 970 Evo | WD Black SN850X 2TB
Display(s) 15,5" / 27" /34"
Case Black & Grey | Phanteks P400S | O11 EVO XL
Audio Device(s) Realtek
Power Supply Li Battery | Seasonic Focus Gold 750W | FSP Hydro TI 1000
Mouse g402
Keyboard Leopold|Ducky
Software LinuxMint
Benchmark Scores i dont care about scores
Thanks for the reviews, and I will wait for the 9950X review.
Compared with all the Zen 4 CPUs, I see a massive improvement over the predecessor in non-gaming workloads.
Now I have a better picture of what's next for my 2700X replacement. I am surely going to pick a non-X3D part to support my work.
 
Joined
Dec 6, 2022
Messages
380 (0.53/day)
Location
NYC
System Name GameStation
Processor AMD R5 5600X
Motherboard Gigabyte B550
Cooling Arctic Freezer II 120
Memory 16 GB
Video Card(s) Sapphire Pulse 7900 XTX
Storage 2 TB SSD
Case Cooler Master Elite 120
As I said previously, I understand why they do this and it is proper scientific method for comparing CPUs, but unfortunately those results don't tell most people if buying a new CPU (or CPU/MB/RAM combo) will help their system run games faster or if it would just be a waste of a lot of money.
Bro, did you read what I wrote?

The first round of reviews rarely does that.

It usually comes later, in follow-up reviews.
I don't think TPU is bribed by Nvidia. They are simply using the fastest GPU to isolate CPU differences. Whether or not you agree with that methodology is another topic, but I don't think there's any nefarious reasoning behind it.
Did I specifically say that TPU is doing that?

Worse, do you really think that there aren't biased “reviewers” out there?

Because if you do, then I have a nice bridge to sell you.
 
Joined
Sep 3, 2019
Messages
3,503 (1.84/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 220W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (375W current) PowerLimit, 1060mV, Adrenalin v24.10.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2161), upgraded from Win10 to Win11 on Jan 2024
As I said previously, I understand why they do this and it is proper scientific method for comparing CPUs, but unfortunately those results don't tell most people if buying a new CPU (or CPU/MB/RAM combo) will help their system run games faster or if it would just be a waste of a lot of money.


I don't think TPU is bribed by Nvidia. They are simply using the fastest GPU to isolate CPU differences. Whether or not you agree with that methodology is another topic, but I don't think there's any nefarious reasoning behind it.


Sure, but people don't just read reviews out of scientific curiosity about which CPU is fastest in an ideal environment - they read them to decide if spending the money on an upgrade would help their computer run stuff faster, "stuff" being whatever apps they care about whether that's games or MS Outlook or video encoding, etc. While the results are certainly valid for a pure CPU comparison, they don't provide context relevant to the average reader.

So what I can gather from the results is if I have a 4090, I can up my gaming FPS by going to 7800x3d (for example), and if I have a 5600x (also for example), I'm probably wasting my money on a 4090 unless I want to drop a 5800x3d in or get a whole new computer with a current gen CPU. However, if I had, say, a 4060, I would know nothing about whether a CPU upgrade would speed up my games based on what's on the review.


Oddly, it looks like the 7900 XTX spanks it at super low resolutions like 1280x720 (look at the ComputerBase review). This is probably driver-related IMO, as I doubt Nvidia spends a lot of time optimizing performance for resolutions that low.


I'm not sure why they did this test - most games on Zen 3 "X" CPUs run within an FPS or three of each other between CPUs. And Cyberpunk is particularly bad among games for showing CPU differences.
OK, I'll break it down. I was trying to make people think about it for a moment, but apparently I failed.

What I meant to say with this is...
As you increase the resolution while keeping the same CPU/GPU, it matters less and less which CPU you are using, because the GPU is the bottleneck in this situation (keep these bold words in mind for later).

Example:

720p
See the difference?
The 7800X3D performs at more than 2.0x the last CPU on the chart.
More than double... it's huge.
[attached chart: 720p CPU comparison]

4K
Notice how much smaller the difference is.
Now the 7800X3D is about 1.2x faster than the last one: +20%.
[attached chart: 4K CPU comparison]

------------------------------------------------

Now let's go through different GPUs, high-end to low-end.

When you decrease the power of the GPU, by testing a mid-range one and then a low-end one, it again matters less and less which CPU you are using, because the GPU is the bottleneck.
Remind you of anything?
It's exactly the same thing as increasing the resolution on the same GPU.
That's why almost no one tests different GPUs in a CPU review. It has less value.

A small example:
See how the differences between CPUs diminish as the GPU power drops.
And mind you, this is at 1080p. At 1440p the differences are even smaller, and so on.
[attached chart: CPU comparison across GPU tiers at 1080p]

And to be honest, there is some frenzy going on about what CPU everyone should get for gaming, especially at higher res.
Don't get me wrong... I'm not saying go get the cheapest (mind the 1% lows too), but you certainly don't need the absolute top one(s) to enjoy gaming at higher res.
Not even one from the top 10 CPUs.
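Here is the same toy min() sketch from earlier, this time holding the resolution at 1080p and shrinking the GPU instead (all numbers invented, loosely echoing the 2.0x and ~1.2x spreads in the charts above):

```python
# Toy model again: delivered FPS = min(CPU ceiling, GPU ceiling), fixed 1080p.
# Figures are made up to mirror the pattern in the charts, not real data.

cpus = {"top CPU": 260, "mid CPU": 190, "old CPU": 130}
gpus = {"high-end GPU": 300, "mid-range GPU": 160, "low-end GPU": 80}

for gpu_name, gpu_fps in gpus.items():
    delivered = {cpu: min(fps, gpu_fps) for cpu, fps in cpus.items()}
    spread = max(delivered.values()) / min(delivered.values())
    print(f"{gpu_name}: {delivered} -> spread {spread:.2f}x")
# high-end GPU: 2.00x spread; mid-range GPU: 1.23x; low-end GPU: 1.00x
# (every CPU lands on the same 80 FPS once the GPU is small enough).
```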
 
Joined
Jan 18, 2021
Messages
168 (0.12/day)
Processor Core i7-12700
Motherboard MSI B660 MAG Mortar
Cooling Noctua NH-D15
Memory G.Skill Ripjaws V 64GB (4x16) DDR4-3600 CL16 @ 3466 MT/s
Video Card(s) AMD RX 6800
Storage Too many to list, lol
Display(s) Gigabyte M27Q
Case Fractal Design Define R5
Power Supply Corsair RM750x
Mouse Too many to list, lol
Keyboard Keychron low profile
Software Fedora, Mint
As I said previously, I understand why they do this and it is proper scientific method for comparing CPUs, but unfortunately those results don't tell most people if buying a new CPU (or CPU/MB/RAM combo) will help their system run games faster or if it would just be a waste of a lot of money.
CPU reviews tell you whether a GPU upgrade will offer your system a worthwhile uplift. GPU reviews tell you whether a CPU upgrade will offer a worthwhile uplift. If your CPU can only hit, say, 80 FPS in your favorite game, then you won't benefit from a GPU that can output 160. And vice-versa. In short, each piece of hardware has a maximum throughput figure in a given context. That's what reviews are meant to isolate and exhibit--the subject hardware's potential, which incidentally also determines its useful life span. Thus, one must compare results for each major component to get the whole picture.

There's no point in reviewers trying to offer "realistic" CPU/GPU combos in their benchmarks, because everyone's configuration and use case differs, but if you really want to see GPU-bound results in a CPU review, you can often find them under the 4k results at e.g. TPU. W1zzard really does go above and beyond.

It is sometimes easy for us tech enthusiasts to forget that CPU reviews are intended first and foremost for people who are already looking to buy; they're not intended to convince you to replace your perfectly good CPU with one that performs 15% better, or w/e. If you have to choose between two CPUs that are currently on offer at a given price tier, then why wouldn't you want the one with better max throughput in your preferred use case? It will last longer. It will accommodate more/better GPU upgrades. Sure, maybe the slower CPU has other advantages--maybe it uses less power, or costs slightly less, etc--but you can't weigh those properly unless you also know the max throughput.

On a more general note, the era of assuming that gamers are GPU-bound is basically over. Ten or fifteen years ago, when everyone targeted 60 Hz (and when CPU tech was stagnant) it might have made sense to complain about "misleading" CPU-isolated benchmarks. Now it really doesn't. Even single-player games can often be heavily CPU bound, either because of a quirk in the engine (see Starfield) or because the player wants to push three-digit FPS to take full advantage of his fancy high-refresh monitor (and may be willing to dial down visuals to get there). The widespread use of upscaling tech (DLSS/FSR) also tends to push the burden back towards the CPU. Then there's a whole crew of competitive multiplayer gamers, who push absurd frame rates and mostly don't care about graphical fidelity. (These people existed ten years ago, of course, but they were stuck with a decade of what you might call "Sandy Bridge Plus" Intel quad cores, and didn't have fancy high-refresh rate monitors to reduce tearing when pushing stratospheric frames.)

Then there are CPU-intensive strategy games. Lately I've been playing something of a hybrid space-sim/4x called X4 Foundations. That game can look fantastic without stressing most any modern GPU really at all, but good luck finding a CPU that will run at 60 FPS stable once the universe gets cluttered. You don't need a 4090 to find CPU-bound scenarios.
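If it helps, the 80-vs-160 FPS example above can be written as a tiny decision sketch (hypothetical numbers and a hypothetical helper, not anything from an actual review):

```python
# Sketch: combine the ceilings a CPU review and a GPU review report
# to see which upgrade would actually raise delivered FPS.
def delivered_fps(cpu_ceiling: float, gpu_ceiling: float) -> float:
    return min(cpu_ceiling, gpu_ceiling)

current     = delivered_fps(80, 160)    # CPU caps at 80, GPU could do 160
cpu_upgrade = delivered_fps(150, 160)   # swap in a hypothetical faster CPU
gpu_upgrade = delivered_fps(80, 240)    # swap in a hypothetical faster GPU

print(current, cpu_upgrade, gpu_upgrade)   # 80 150 80
# The GPU upgrade buys nothing here; only the CPU upgrade moves the needle.
```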
 
Joined
Feb 28, 2024
Messages
66 (0.25/day)
You guys all make some good points. Good discussion!

Did I specifically say that TPU is doing that?
No you did not, but since we are commenting on a review with a 4090, it was easy to assume it might be implied. I guess this is why they say we shouldn't ass/u/me, right? Sorry, my bad.
Worse, do you really think that there aren't biased “reviewers” out there?
Of course there are. But most of the big, well known ones go out of their way to be sure they aren't.
 

doc7000

New Member
Joined
Mar 6, 2022
Messages
26 (0.03/day)
A lot of reviewers have made a lot of good points, but some have missed the plot when it comes to power efficiency. When both the 7700X and 9700X are run at 140 watts, the 9700X beats the 7700X by a good margin in many workloads.

While yes, at 70 watts Zen 4 and Zen 5 are similar when it comes to power efficiency, a lot of CPU architectures are pretty efficient at lower power levels. If I am going to run a CPU at 140 watts, then that is really where I would want the gains.
 
Joined
Jan 27, 2015
Messages
1,715 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
A lot of reviewers have made a lot of good points, but some have missed the plot when it comes to power efficiency. When both the 7700X and 9700X are run at 140 watts, the 9700X beats the 7700X by a good margin in many workloads.

While yes, at 70 watts Zen 4 and Zen 5 are similar when it comes to power efficiency, a lot of CPU architectures are pretty efficient at lower power levels. If I am going to run a CPU at 140 watts, then that is really where I would want the gains.

Yeah, it's not like even an Intel user is running their CPU at 200W 24/7. The high power draw is only for relatively short bursts, and it doesn't add up to much.

And to that point, virtually every power efficiency test out there is hopelessly flawed.

The reason is straightforward - they ignore what PCs do 90+% of the time. And that is - nothing.

They are at, or near, idle. Whether you're browsing here on this forum, watching a YouTube video, or scrolling Facebook / Twitter / Instagram, etc. - your CPU is mostly idling.

Real workflow analysis of different chips, including the idle times, gives an entirely different perspective on 'efficiency' and 'power use'.

In short, in the real world, Zen is and always has been horribly inefficient due to very high idle power draws.
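A rough sketch of the kind of 'workflow' math that argument implies (all wattages and time shares are invented for illustration, not measurements of any actual chip):

```python
# Duty-cycle energy model: weight each power state by the time spent in it.
# Hypothetical figures only - the point is the shape of the math, not the values.

def daily_wh(idle_w: float, load_w: float, hours_on: float, load_share: float) -> float:
    return idle_w * hours_on * (1 - load_share) + load_w * hours_on * load_share

# Chip A: wins the load benchmark but idles high. Chip B: the reverse.
chip_a = daily_wh(idle_w=35, load_w=120, hours_on=8, load_share=0.10)
chip_b = daily_wh(idle_w=15, load_w=160, hours_on=8, load_share=0.10)
print(f"A: {chip_a:.0f} Wh/day, B: {chip_b:.0f} Wh/day")   # A: 348, B: 236
# With 90% idle time, the low-idle chip uses less energy despite the worse load number.
```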





 
Joined
Dec 12, 2012
Messages
773 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
@RandallFlagg

I've seen that before and it didn't make sense to me. If Ryzen chips are consuming so much at idle, how is TPU getting sub-20 W consumption in certain single-threaded applications? They measure the power physically from the 12 V connector. Some CPUs even go below 10 W.

Either way, personally I only care about efficiency in terms of heat output. Idle consumption is irrelevant to me, and I'm happy if the CPU is peaking under 100 W in gaming (so i9 chips are automatically disqualified).
 
Joined
Jan 27, 2015
Messages
1,715 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
I'm not really going to argue a whole lot on this; it's all been done before.

Testing a PC under load and deriving real-world efficiency from it is like testing a car's efficiency while doing quarter-mile, wide-open-throttle runs. Nobody does that in the automotive space because it is, frankly, stupid.

When sites like Consumer Reports or Motor Trend test a car's MPG, they run it through a mixed-use route or, in CR's case, a track they made specifically for that. It mimics various real-world driving, the vast majority of which is not done at full throttle.

That is what the video reviewer above calls a 'workflow'. His argument is pretty ironclad, especially when you consider that a 'workflow' is exactly what gets measured when anyone tests the efficiency or reliability of anything, ever, whether it's a transformer for a power substation, a car, or a commercial truck. For some reason, this common-sense methodology has never made it into the PC world. Too much dependence on clicks, too many false narratives.


 
Joined
Apr 30, 2020
Messages
985 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X3D
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Capellix
Memory Corsair Vengeance Pro RGB 3200MHz 32GB
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western Digital SATA 6.0 SSD 500GB + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Corsair K55 Pro RGB
Joined
Dec 12, 2012
Messages
773 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
I'm not really going to argue a whole lot on this; it's all been done before.

It's not my intention to argue; I'd just like to understand it.

So what about this graph? Where does the 20-25 W in those 3 apps come from?
Maybe that high whole-system idle power usage is because of the motherboard/chipset, not the CPU?
People often say it's the I/O die that's wasting power, but wouldn't that show up in the EPS connector measurement? Unless the CPU also draws power from the 24-pin connector. The 8500G also has pretty high total system usage, but it doesn't have an I/O die.

 
Joined
Jan 27, 2015
Messages
1,715 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
It's not my intention to argue; I'd just like to understand it.

So what about this graph? Where does the 20-25 W in those 3 apps come from?
Maybe that high whole-system idle power usage is because of the motherboard/chipset, not the CPU?
People often say it's the I/O die that's wasting power, but wouldn't that show up in the EPS connector measurement? Unless the CPU also draws power from the 24-pin connector. The 8500G also has pretty high total system usage, but it doesn't have an I/O die.



Every one of those benchmarks is essentially a measure of 'wide-open throttle' for its respective task.

But is that how it happens in the real world? No, of course not.

Think about a software developer, and note that the compile line on your chart shows ~90 W. That chart suggests that anytime I'm coding (the workflow) I'm using 90 W? Of course not. Most of 'coding' is reading, writing in a text editor, and looking up how to code something in a browser. That's the *workflow*. It would only use 90 W during the code compiles, which is like 1% of the time spent during development (if that).

So none of that represents a real 'workflow'. It represents max performance under a workload for each of those apps, which is typically something that only happens for a small fraction of the time spent on each of those.
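Putting rough numbers on that (the ~90 W compile figure comes from the chart being discussed; the idle draw and the 1% split are assumptions):

```python
# Time-weighted average power for the coding workflow described above.
compile_w = 90        # the compile line from the chart under discussion
idle_w = 20           # assumed near-idle draw while reading/typing/browsing
compile_share = 0.01  # assume compiles occupy ~1% of development time

avg_w = compile_share * compile_w + (1 - compile_share) * idle_w
print(f"average draw while 'coding': {avg_w:.1f} W")   # 20.7 W, nowhere near 90 W
```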
 
Joined
Dec 12, 2012
Messages
773 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
You didn't even read my post. I CLEARLY asked about the THREE apps that use 20-25 W.
That's so much lower than the total system idle consumption of 80 W (I know that's before power conversion where some loss occurs). If the CPU is using 20-25 W, what is using the remaining 55-60 W? Is it all the other components in the system?

If so, then I don't really see how a difference of 22 W is relevant. I calculated the cost of electricity in my country (where the average income is 3x lower than in the West). At 12 hours every single day, it comes to about $2 per month. In the grand scheme of the total power consumption in my house, it's basically nothing.

The chart from the video shows energy consumption of ~60 kWh per year. That's $15 over here, so even cutting it in half saves just $7.50 per year. I mean sure, one extra pizza per year sounds nice. ;)
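For what it's worth, the arithmetic behind both figures (the per-kWh price is inferred from the $2/month claim, so treat it as an assumption):

```python
# Back-of-envelope check of the $2/month and $15/year figures.
price = 0.25                          # USD per kWh - inferred, not stated

monthly_kwh = 22 * 12 * 30 / 1000     # 22 W difference, 12 h/day -> ~7.9 kWh
print(f"22 W x 12 h/day: ~${monthly_kwh * price:.2f}/month")   # ~$1.98

yearly_kwh = 60                       # the ~60 kWh/year from the video's chart
print(f"60 kWh/year: ~${yearly_kwh * price:.0f}/year")         # $15
```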
 
Last edited: