
AMD Ryzen 9 7950X3D

I completely agree with you (surprisingly). That's why I find out-of-the-box efficiency comparisons completely dumb. Efficiency comparisons should be done at equal wattage, in which case sure, the 7950X is more efficient than the 13900K (only in MT workloads), but not by much. Best case is 19% in V-Ray, but usually it's around 10-15% in most workloads. On the other hand, the 13900K is much more efficient in mixed workloads. Gaming is the only task where the 13900K stinks, but frankly, so does the normal 7950X.

Out-of-the-box efficiency is relevant for most people, as most people don't tweak settings much, if at all.
 
Out-of-the-box efficiency is relevant for most people, as most people don't tweak settings much, if at all.
Great, then Intel wins hands down with their T and non-K lineup of SKUs.
 
As I said, it's just an attack on wallets. Its only advantage is efficiency, which is not a big deal in gaming. Overall, it has the most disastrous performance/$ around and is nothing special. Maybe the 7700X3D will change the picture, because the 7950X3D is far from the impact of the 5800X3D's launch.

[Attached charts: games.jpg, app.jpg]
 

No, they don't. The 13900K (why is the 13900KS never tested?) still outperforms it in many applications and games! All they did is add more cache than Intel. Intel has the better design. If all things were equal, same cache, the Intel chip would slaughter the AMD one.
Imagine a 12900K or 14900K with 90 MB of cache at 5.8 GHz... it would be unbeatable.
 
Still many stupid comments like "oh, it's not winning by much compared to my 13900K, so I won't switch to it".
Of course you won't switch, icebrain!

"Upgrading from the platform you already have is always cheapest."
This is the golden rule, and it applies in the opposite direction as well. If I already have an AM5 platform, it's also not worth switching to an Intel platform at all.

However, you'll always find Intel fanboys emphasizing that the "CPU" is not worth switching the "whole platform" for. Which is funny. How hard is it to not be a sore loser?
 
@W1zzard Any chance of retesting Cyberpunk with RT? Most likely FG was on for the 3D results; I've tested the 3D and it's doing worse than the 13900K in that game.
 
Not bad. Though the AMD CPU is quite close to maxing out its memory speed at DDR5-6000, with only a few samples able to reach 6200 or 6400, while pretty much every unlocked 13th-gen Intel CPU can do DDR5-7600, so benchmarking the Intel Raptor Lake processors at 6000 is leaving about 20% of memory frequency on the table (6000 is roughly 21% below 7600).
 
That's a laptop CPU. On desktop, Intel is the king of efficiency.
False. A pure single CPU thread is not a real-world use case.

Not bad. Though the AMD CPU is quite close to maxing out its memory speed at DDR5-6000, with only a few samples able to reach 6200 or 6400, while pretty much every unlocked 13th-gen Intel CPU can do DDR5-7600, so benchmarking the Intel Raptor Lake processors at 6000 is leaving about 20% of memory frequency on the table (6000 is roughly 21% below 7600).
Tighter memory timings have greater benefits for Zen 4.

Hardware Unboxed's Spider-Man benchmark shows an improvement from 129 FPS to 140 FPS (roughly 8.5%) via easy memory-tightening methods. 3D cache can reduce the need for tightening memory timings.
 
As I said, it's just an attack on wallets. Its only advantage is efficiency, which is not a big deal in gaming. Overall, it has the most disastrous performance/$ around and is nothing special. Maybe the 7700X3D will change the picture, because the 7950X3D is far from the impact of the 5800X3D's launch.

Regardless of your sentiment, I am totally enjoying my 7900X3D and not missing my 5800X3D. It is sad that nothing of merit will change the position of you or some other users in an AMD thread. Cherry-picking, in a world where CPU performance is nuanced based on software compatibility. As far as an attack on the wallet goes, DDR5 is not more expensive than DDR4 that can keep up with it, and is in some cases cheaper. Motherboards are only expensive if you want the candy.
 
That's because you're watching reviews of the unlocked 13900K with unlimited power. There are a ton of Intel SKUs that are more efficient than AMD's best, like the 13900T, 13700T, 12900T, 13900, 12900, 13700, etc. And the list goes on.
Intel"s unlimited power is Intel's attempts to top bench charts.

Desktop Zen 4 has configurable TDP settings, i.e. an X SKU can act like a non-X SKU with a single BIOS drop-down option.
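For reference, here's a minimal sketch of the TDP-to-socket-power relationship behind those presets, assuming AMD's publicly documented PPT ≈ 1.35 × TDP rule for AM4/AM5 (the exact Eco Mode options vary by board vendor):

```python
# Sketch: AMD socket power limit (PPT) from the advertised TDP,
# assuming the documented AM4/AM5 relation PPT = 1.35 * TDP.
def ppt_from_tdp(tdp_watts: int) -> int:
    return round(tdp_watts * 1.35)

for tdp in (65, 105, 170):
    print(f"TDP {tdp} W -> PPT {ppt_from_tdp(tdp)} W")
# TDP 65 W  -> PPT 88 W
# TDP 105 W -> PPT 142 W
# TDP 170 W -> PPT 230 W
```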
 
Intel"s unlimited power is Intel's attempts to top bench charts.

Desktop Zen 4 has configurable TDP settings i.e. X SKU can act like non-X SKU with a single BIOS drop down option.
Yeah, but people keep telling me that 99% of users run stock out of the box; they don't go into the BIOS. In which case Intel has the most efficient SKUs.
 
False. A pure single CPU thread is not a real-world use case.


Tighter memory timings have greater benefits for Zen 4.

Hardware Unboxed's Spider-Man benchmark shows an improvement from 129 FPS to 140 FPS (roughly 8.5%) via easy memory-tightening methods. 3D cache can reduce the need for tightening memory timings.
Tightening timings shows great improvements on pretty much all platforms, meaning both Intel and AMD. The 3D chips won't benefit as much unless the game doesn't fit in the cache.
 
On this page, AMD Ryzen 9 7950X3D Review - Best of Both Worlds - Power Consumption & Efficiency | TechPowerUp, under Gaming Efficiency you refer to "Frames per Watt", yet in the detail you show FPS, not frames per watt.

Also, I'd like to see graphs of the most efficient PL1/PL2 curve, so I know how best to set them for each CPU. For example, is 65/95 the most efficient for the 13700K, or 95/125, 125/175, etc.? I'm sure that as you turn up the power it gets less efficient. I heard der8auer said 95 W was very efficient for the 13900K.
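For anyone wanting to trace that curve themselves, here's a minimal sketch, assuming Linux with the intel_rapl powercap driver (run as root; constraint 0 is the long-term PL1, constraint 1 the short-term PL2):

```python
# Sketch: set Intel PL1/PL2 via the Linux powercap sysfs interface,
# then benchmark at each setting and record score per watt.
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl/intel-rapl:0")  # CPU package 0

def set_limits(pl1_watts: float, pl2_watts: float) -> None:
    # Values are in microwatts; writing requires root.
    (RAPL / "constraint_0_power_limit_uw").write_text(str(int(pl1_watts * 1e6)))
    (RAPL / "constraint_1_power_limit_uw").write_text(str(int(pl2_watts * 1e6)))

# Sweep the pairs from the post, benchmarking between each step:
for pl1, pl2 in [(65, 95), (95, 125), (125, 175)]:
    set_limits(pl1, pl2)
    # ...run your benchmark here and log score / measured watts...
```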
 
Regardless of your sentiment, I am totally enjoying my 7900X3D and not missing my 5800X3D. It is sad that nothing of merit will change the position of you or some other users in an AMD thread. Cherry-picking, in a world where CPU performance is nuanced based on software compatibility. As far as an attack on the wallet goes, DDR5 is not more expensive than DDR4 that can keep up with it, and is in some cases cheaper. Motherboards are only expensive if you want the candy.
Your declared history: unknown -> 5900X -> 5800X3D -> 7900X3D (three processors in two years). All to feed the "beast" 6500XT on steroids. Don't start with productivity, because your stated history contradicts you. You don't trade a 5900X for a 5800X3D for productivity.
For me it's simpler: 10500 + 12500 + 13500 = cheaper than a 7950X3D and a touch more expensive than a 7900X3D.
Using only a 3070 Ti, by no means the "beast" 6500XT, I take the weakest of them (the 10500) and attach game captures. What do you say, can I play decently? Because in the end only this matters: whether you can play decently with the money invested, which is much less than an X3D and can be used for other purchases.
If the RTX 4090 barely surpasses the competition marginally (or not at all), do you think most gamers will feel the impact of these processors, apart from an empty wallet? Even with an RTX 4090, do you think anyone will feel the impact? The RTX 4090 is worth it at 4K, where a 7950X3D is barely 2% ahead of a 13600K!

[Attached game captures: Assassin's Creed Odyssey, wotencore, Cyberpunk 2077 Ultra, Shadow of the Tomb Raider (sync 100 Hz), Shadow of the Tomb Raider (sync off)]
 
Your declared history: unknown -> 5900X -> 5800X3D -> 7900X3D (three processors in two years). All to feed the "beast" 6500XT on steroids. Don't start with productivity, because your stated history contradicts you. You don't trade a 5900X for a 5800X3D for productivity.
For me it's simpler: 10500 + 12500 + 13500 = cheaper than a 7950X3D and a touch more expensive than a 7900X3D.
Using only a 3070 Ti, by no means the "beast" 6500XT, I take the weakest of them (the 10500) and attach game captures. What do you say, can I play decently? Because in the end only this matters: whether you can play decently with the money invested, which is much less than an X3D and can be used for other purchases.
If the RTX 4090 barely surpasses the competition marginally (or not at all), do you think most gamers will feel the impact of these processors, apart from an empty wallet? Even with an RTX 4090, do you think anyone will feel the impact? The RTX 4090 is worth it at 4K, where a 7950X3D is barely 2% ahead of a 13600K!

First of all, the 6500XT was used in my HTPC to satisfy my curiosity about the hatred shown to the card. I sell PCs as a hobby. My 5800X3D, and now my 7900X3D, are driving my 7900XT, and yes, I got the 7900X3D because I missed the snappiness of 12 cores. There is an interesting find I have made: if you OC your GPU, it can push the CPU to the limit. TWWH3 is a great study for these new CPUs. When you are in battles or moving around the map it is GPU-bound, but when you end your turn the other CCD takes over: I can see 5.2 GHz, and my Noctua will spin right up to its 3000 RPM max while my overlay shows a CPU temp of 82°C, so the second CCD is working.

The industry is caught up on FPS, but the secret of the X3D chips is smooth gameplay. There is also the fact that my 7900X3D uses less power than that 13500 while having double the number of real cores and higher base and boost clocks. We don't have 7800X3D chips to compare yet, but we do have these monsters, and if you think your wallet won't be hit as hard, you are joking: the difference in cost between a 7900X3D with an Asus Strix E board and a 13900K with an Asus Strix E board is about $5.

Where my chip is also better is PCIe allocation: I don't have to worry about losing lanes if I put an M.2 drive in the slot it was made for. I could continue, but I don't want this thread to be hijacked the way you and a few others like to opine on AMD threads about why Intel is "better".
 
Yeah, but people keep telling me that 99% of users run stock out of the box; they don't go into the BIOS. In which case Intel has the most efficient SKUs.

They do not.

Because Intel motherboards run CPUs with no power limit out of the box.
 
As I said, it's just an attack on wallets. Its only advantage is efficiency, which is not a big deal in gaming. Overall, it has the most disastrous performance/$ around and is nothing special. Maybe the 7700X3D will change the picture, because the 7950X3D is far from the impact of the 5800X3D's launch.
An attack on the wallets?? You do realize that electricity costs money!? Greater efficiency equates to lower power and electricity usage. If you buy a 13900K for $620 and then spend an additional $20/year on electricity, over 5 years your total cost will be greater. Or let me guess, you don't pay for electricity because you're living at home...

I attached calculations showing the estimated cost using the 13900K's and 7950X3D's average power in gaming (143 W vs. 56 W).

Total cost:
13900K: $620 + $20/yr extra × 5 years = $720
7950X3D: $699
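For what it's worth, here's a minimal sketch of the kind of calculation behind numbers like these; the electricity rate and gaming hours are my assumptions, not from the post:

```python
# Sketch: annual CPU gaming electricity cost from average power draw.
GAMING_WATTS = {"13900K": 143, "7950X3D": 56}  # averages quoted above
RATE_PER_KWH = 0.12   # assumed $/kWh
HOURS_PER_DAY = 5     # assumed daily gaming time

for cpu, watts in GAMING_WATTS.items():
    annual = watts / 1000 * HOURS_PER_DAY * 365 * RATE_PER_KWH
    print(f"{cpu}: ${annual:.2f}/year")
# The ~87 W gap works out to roughly $19/year at these assumptions,
# in line with the ~$20/year figure used above.
```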
 

[Attached: Cost_Calculator_13900k.png, Cost_Calculator_7950x3d.png]
under Gaming Efficiency you refer to "Frames per Watt", yet in the detail you show FPS, not frames per watt.
It is a bit of a simplification to make it more accessible for readers. The underlying physics checks out, IMO; happy to have a discussion on that if you want.
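To spell out the unit analysis being referenced, a quick sketch (the numbers are made up for illustration, not from the review):

```python
# FPS / watts has units (frames/s) / (joules/s) = frames per joule,
# so an FPS-per-watt chart is really an energy-efficiency chart.
fps = 120.0    # average frames per second in a game test (illustrative)
watts = 60.0   # average CPU package power in the same test (illustrative)

print(f"{fps / watts:.2f} frames per joule of CPU energy")  # 2.00
```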
 
Emulation results are very different from all of your past tests, even compared to the 13900K review. Also, I don't get the game change to "Super Mario Kart 8". Can you please elaborate on these results? Cache never mattered between the 5800X3D and 5800X in past tests, and suddenly the 7950X3D tops the charts, even though the 7950X was far behind the competition before.
I'd also advise using Pokémon Scarlet/Violet with an FPS unlock mod for testing, as it's the heaviest Switch game right now by far.
 
An attack on the wallets?? You do realize that electricity costs money!? Greater efficiency equates to lower power and electricity usage. If you buy a 13900K for $620 and then spend an additional $20/year on electricity, over 5 years your total cost will be greater. Or let me guess, you don't pay for electricity because you're living at home...

I attached calculations showing the estimated cost using the 13900K's and 7950X3D's average power in gaming (143 W vs. 56 W).

Total cost:
13900K: $620 + $20/yr extra × 5 years = $720
7950X3D: $699
Where did you get those numbers from?
( https://www.techpowerup.com/review/amd-ryzen-9-7950x3d/24.html )

7950X3D ($700) vs. 13900K ($580)
Power consumption, single-threaded:
37 W vs. 33 W
Power consumption, multi-threaded:
140 W vs. 267 W
Power consumption, applications (47-test average):
79 W vs. 169 W


$700 + ($31.33 × 5 yr) = $856.65
$580 + ($67.02 × 5 yr) = $915.10
...so if you cut $60 of RGB out of your Intel build you'll probably be on par; then it doesn't matter which CPU you pick, as long as you keep your CPU for 5 years. (That was a mild attempt at humor.)

Power consumption, gaming (13-test average):
56 W vs. 143 W
 
I was just re-reading this review and I've decided to buy a couple 5800X3Ds for my personal machines.

The 7950X3D is likely to be the best sub-200 W gaming CPU for a considerable while, and honestly, with the 5800X3D at 88% of its performance, I'm more than happy to keep my AM4 boards with their DDR4-3600 and just flip the 3600XT and 5800X on eBay. My net cost after that will probably be lower than a single B650 motherboard!

I'm using AM5 at work as a validation platform and it just feels like 1st Gen Ryzen all over again with constant firmware and OS patches to get compatibility and bugs ironed out. With the Ryzen 1000-series at least, it was exciting to get competition back in the CPU market, and to be given 8C/16T after so many years of Intel screwing over mainstream platforms with 7 successive generations of yawn-inducing quad cores. Pricing of Zen1 was also very appealing so it was easy to overlook the constant AGESA updates and patches to improve compatibility with DDR4 and iron out weirdness with W10's scheduler.

Now, with the high price of DDR5 and AM5 boards, and the fact that AM4 to AM5 isn't quite such a generational leap, I'm simply not interested enough to jump on Ryzen 7000 at home. I'll gladly keep my PCIe 4.0 B550 and X570 boards and use AM4's swansong CPU to push those systems well into 2024. DirectStorage still isn't here in any beneficial way, PCIe 5.0 SSDs clearly aren't affordable or consumer-ready, and W1zzard just proved that an RTX 4090 is still reasonably happy in a PCIe 2.0 motherboard slot, not that I intend to buy a 4090...
 
No, they don't. The 13900K (why is the 13900KS never tested?) still outperforms it in many applications and games! All they did is add more cache than Intel. Intel has the better design. If all things were equal, same cache, the Intel chip would slaughter the AMD one.
Intel having the "better design" is laughable. Throwing obscene amounts of power at a chip to hit high clocks isn't a better design. "If all things were equal", well guess what, they're not.
 
Intel having the "better design" is laughable. Throwing obscene amounts of power at a chip to hit high clocks isn't a better design. "If all things were equal", well guess what, they're not.
I'm not going to defend Intel, but the fact that you *can* throw 350 W at a 13900K to get more performance is at least something. You can't do that on AM5; AMD's design on TSMC 5 nm simply doesn't handle additional power the same way.

Whether you like the horrific, inefficient, wasteful 13th-gen flagships or not, they do hold the performance crown. You can probably tell which side of the fence I'm sitting on, but I can't deny the facts.
 
I'm not going to defend Intel, but the fact that you *can* throw 350 W at a 13900K to get more performance is at least something. You can't do that on AM5; AMD's design on TSMC 5 nm simply doesn't handle additional power the same way.
That's the benefit of having your own fabs, too: you can optimize the process to fit your products. The 13900K scales better with more power beyond about 140 W, while TSMC's biggest customers are focused on low-power devices. The table below is condensed from the ComputerBase.de review of the 13900K. It shows multi-core performance normalized to a Core i7-13700K at a PL1 and PL2 of 65 W.

CPU            | 45 W | 65 W | 88 W | 125 W | 142 W | 181 W | 230 W | 241/253 W | Unlimited
Core i9-13900K | 92%  | 117% | 135% | 153%  | 159%  | –     | –     | 184%      | 186%
Ryzen 9 7950X  | 84%  | 130% | 154% | –     | 180%  | –     | 189%  | –         | –

Like you, I prefer decent performance at sane power levels, but the superior power scaling of Intel's process at high power levels is undeniable.
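To make the scaling argument concrete, here's a minimal sketch computing the marginal gain per extra watt from the (power, score) pairs above; the pairing of values to power columns follows the mapping assumed in the table, so treat the exact numbers as illustrative:

```python
# Sketch: marginal multi-core gain (percentage points per watt) between
# adjacent power levels, using the table values as (watts, score) pairs.
points_13900k = [(45, 92), (65, 117), (88, 135), (125, 153), (142, 159), (253, 184)]
points_7950x = [(45, 84), (65, 130), (88, 154), (142, 180), (230, 189)]

def marginal_gains(points):
    return [(f"{w0}->{w1} W", round((s1 - s0) / (w1 - w0), 2))
            for (w0, s0), (w1, s1) in zip(points, points[1:])]

print("13900K:", marginal_gains(points_13900k))
print("7950X:", marginal_gains(points_7950x))
# Both curves flatten as power rises, but above ~140 W the 13900K still
# gains ~0.23 pts/W (142->253 W) vs. ~0.10 pts/W for the 7950X (142->230 W).
```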
 