
AMD Ryzen 9 5950X

@W1zzard Will you ever revisit RTX 3090 SLI and/or DX12 mGPU to compare the 10900K and 5950X when they're not graphics-bottlenecked?
SLI is dead; NVIDIA disabled implicit SLI (the SLI you're thinking of). It no longer works in the vast majority of games.

This may be getting picky, but for strategy I would like to see one of the following: Total War (any current release), Cities: Skylines (a sim, I know, but still in the genre), or Ashes of the Singularity.
Umm, I don't think anyone actually plays Ashes of the Singularity. It's only used as a benchmark. I still feel like Civ is the better choice of all of those?
 
For the 2021 CPU bench I'll definitely think about other games. But if you had to pick one strategy game, wouldn't it be Civ?
As much as I like Civ games, I don't think FPS is that important in turn-based games... so a real time strategy game would probably be more appropriate (IMO)
 
That is a great point, but strategy games load the CPU in ways other games don't; note how multi-core processors do much better here.
 
The MT energy efficiency is insane, ideal for a quiet home-server build. If ASRock pulls an ASRock move and allows X370 to take the 5000 series, I may consider it for my home-server upgrade.
 
Great review.
But I disagree when you list the high price as a negative. For this number of cores and this potential, it's an excellent deal.
 
What's your math for this? Mine is here https://tpucdn.com/review/amd-ryzen-9-5950x/images/performance-per-dollar.png

Now, of course, if you charge your clients $200 per hour to get something done, then a faster PC means less billed time, which could quickly make it worth it.
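For anyone wondering how that chart is built: a performance-per-dollar figure is just relative performance divided by street price. A minimal sketch, with made-up scores and prices (not the review's actual data):

```python
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance points per dollar spent."""
    return relative_perf / price_usd

# Hypothetical example: a $799 flagship scoring 100 relative points
# versus a $299 midrange part scoring 80.
flagship = perf_per_dollar(100.0, 799)  # ~0.125 points per dollar
midrange = perf_per_dollar(80.0, 299)   # ~0.268 points per dollar
```

The cheaper part usually wins on value even while losing on absolute performance, which is exactly the shape the linked chart shows.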

Why is the temperature on the 5800X in the high 70s while this one is in the high 50s?
The 5800X is one CCD, so all the heat is concentrated on one die; on the 5900X it's split between two CCDs.
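The split matters because die temperature tracks power density at the silicon, not just total watts. A rough sketch of the arithmetic (the CCD area and power figures are illustrative assumptions, not measured values):

```python
# Why one CCD runs hotter than two at the same total core power:
# the same watts are concentrated in half the silicon area.
CCD_AREA_MM2 = 81.0  # rough area of one Zen 3 CCD (assumed figure)

def ccd_power_density(core_power_w: float, n_ccds: int) -> float:
    """Core power per square millimetre of CCD silicon."""
    return core_power_w / (n_ccds * CCD_AREA_MM2)

one_ccd = ccd_power_density(110.0, 1)  # single-CCD layout (5800X-style)
two_ccd = ccd_power_density(110.0, 2)  # dual-CCD layout (5900X/5950X-style)
# one_ccd is exactly twice two_ccd: same heat, half the area.
```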
 
As good as the review is, as usual, without the 3950X in there it's just incomplete data. I don't mean to be picky here, but people like me with a 3950X are the most likely to consider a 5950X upgrade.
 
Unfortunately, AMD never sent me a 3950X for review, and it's not worth buying one now that it's obsolete.
 
SLI is dead; NVIDIA disabled implicit SLI (the SLI you're thinking of). It no longer works in the vast majority of games.

Umm, I don't think anyone actually plays Ashes of the Singularity. It's only used as a benchmark. I still feel like Civ is the better choice of all of those?

If you're going to base your rig around what people are buying and playing, you might as well not use any of the top-end CPUs or GPUs and just build a bench rig based on performance per dollar, with game bundles. ¯\_(ツ)_/¯

W1zzard, that 5950X is one of the sucky ones; it seems you got one that's about second to last in overclocking. The 4.3-4.4 GHz chips are the worst-clocking ones, while the better ones will hit 4.6-4.7 GHz.
Zen 3 has huge variation between CPUs. Even der8auer said some chips don't like going below -60°C on cold, while others will go to -196°C. It's crazy how big a difference there is between chips.
 
They both have the same TDP, so don't expect any significant differences in thermals.

I've been reading techpowerup for many years, and this is the first time I'm so disappointed with the reviews of the Ryzen 5000 series.
No one seems to want to read the data, otherwise you would have noticed some obvious errors.
It seems like a sponsored review, because there are too many errors in the process and data.

If you want to test a CPU, you must use tests that highlight the CPU bottleneck, not the ones in which it's obvious that the GPU is the bottleneck.
You said that a 2080 Ti and Ultra details are the best-case scenario to replicate the most common use of the CPU.
But that's WRONG! You are TESTING A CPU, not a system.
If you want to test a CPU you have to avoid a GPU bottleneck, so use the best GPU on the market, like a 3080 or 3090, like all the other best reviewers on the web.
The RAM speed/latency and number of modules are not the problem, because tests show both brands (Intel and AMD) reach better performance with them.

If you see in your CPU test that a slower or older CPU reaches better performance than the newer models... maybe you are doing it wrong.
How is it even possible that an i9-10900K, a CPU with 10 cores (14 nm), 5 GHz, and a TDP of over 200 W, can reach lower temps in a stress test than a Ryzen 5600X with 6 cores (7 nm), 4.6 GHz, and a 65 W TDP?!
C'mon! We are not stupid, and the rest of the tech reviewers are telling us the opposite!
I'm a fan of the PC world, and I immediately noticed that something was wrong in your chart. I wonder how a professional in this field with many years of experience like you can't see the same...

Like A.Einstein said: "Everyone is a genius. But if you judge a fish by its ability to climb a tree, it will live its whole life believing that it is stupid."
If you test a CPU without focusing on actual CPU performance, you are doing a useless job.

You are an important tech site; you have to delete all the wrong reviews and remake a totally updated version with all the benchmarks corrected.
[Attachment: power-comparison.png]
 

Small die area = hot spot. Hot spot does not equal total heat output (watts). It's the same reason you can burn your finger, yet not turn your immediate environment into a flaming inferno when you flick a lighter on.

5600X will run hot at the die, but not pull a lot of power. 10900k can run cooler at the die sensor, but pull a ton of juice. Both wattage values are well within the dissipation abilities of a basic 240mm AIO.
 
You are an important tech site; you have to delete all the wrong reviews and remake a totally updated version with all the benchmarks corrected.
You have to stop being so ignorant.
 
Small die area = hot spot. Hot spot does not equal total heat output (watts). It's the same reason you can burn your finger, yet not turn your immediate environment into a flaming inferno when you flick a lighter on.

5600X will run hot at the die, but not pull a lot of power. 10900k can run cooler at the die sensor, but pull a ton of juice. Both wattage values are well within the dissipation abilities of a basic 240mm AIO.


...maybe someone forgot to show the correct temperature, without the 65 W TDP factory lock?

[Attachment: cpu-temperature.png]


You have to stop being so ignorant.

Explain why they made an article which analyzes the problem in the reviews, but no mention of this article is shown in the original reviews of the 5900X, 5800X, and 5600X.

If the site were mine, I would be the first to want to inform my users that I'd found a problem.

I mean, everyone can make a mistake, but you should be more clear and inform the users.
If I am wrong, why do the rest of the tech websites show different results?
 
Umm, I don't think anyone actually plays Ashes of the Singularity. It's only used as a benchmark. I still feel like Civ is the better choice of all of those?

That's the point of Ashes of the Singularity being mentioned: it is a benchmark. In my opinion, Total War (the current Total War games) or AotS can serve most users' benchmarking needs. I didn't include a game like Cities: Skylines, but it is a great benchmark too.

To me, Civ is like running DotA 2 or CS:GO as a benchmark.
 
So it absolutely spanks everything, as expected, except in GPU-limited gaming; and if you can afford a 5950X, you can afford a 3090 too.
 
The 3800 memory in gaming seems very pro-AMD, and I won't complain a bit. Competition is GREAT to see. However, wouldn't it be fairer to include the same memory speed on the Intel variants too, in order to show that level of gain for comparison as well?

Nonetheless, we consumers are really benefiting from the improvements AMD has made lately. Bravo!!!
 

Your marked-improvement comment is noted; however, it's not worth calling it an improvement when the term should be decisive victory. The 5950X is the best consumer CPU on the market. Intel's top-line CPUs get blown away in everything outside of gaming performance, where the performance is heavily GPU-dependent and the consumer system isn't pairing a top-tier CPU with top-tier components, i.e., memory and GPU.

You're stating W1z didn't perform the same testing on the Intel variant, and that is false. The same testing was performed on the top tier, which is essentially what is being compared here.

Once again, this cycle has shown Intel is backed into the superficial (kudos to W1z for digging in deep on this) corner of being crowned king of gaming, at the cost of low efficiency and absurd power demands at this point in time. Intel will eventually outsource fabbing of their CPUs to please shareholders.
 

I'm not up to date on how RAM affects systems nowadays. Golly, I still joke with friends about being late because I got on the wrong front-side bus!

I'm not a big overclocker, and in the past few years I've fallen a little behind on my knowledge of RAM, clocks, buses, and many of the other relationships that contribute to performance figures. I have a surface-level impression that Ryzen benefits from faster RAM more than Intel machines do, although I realize I could be wrong.

I was a little confused when all of a sudden a graph like that showed up. Whoa! It... looks neat, but it's making me ask more questions now.

EDIT: apologies if my post came off as "look fast ram make go fast why? AMD rUl3z! LoLz", which is certainly not my intent. I just noticed the 5950X was the only one with the extra RAM speed and was curious why.

DOUBLE EDIT: OOPS! I didn't read the chart well enough. I see the system check with the 3090, better RAM, and more NOW! Hahaha. OK, I think I just didn't read closely enough. Carry on, folks, carry on.
 
As much as I like Civ games, I don't think FPS is that important in turn-based games... so a real time strategy game would probably be more appropriate (IMO)

Civ VI's main benefit is its beautiful cinematography and effects.

If you were actually in the mood for a strategy game, however... you should pretty much stick to... well... any other Civ that ever came out. I think Civ VI oversimplifies the mechanics of the series. I realize that it's important to simplify things for new audiences and all... but it's almost anti-strategy. A lot of decisions (tall vs. wide, workers/builders vs. settlers, etc.) are streamlined to the point where the optimal strategy is obvious.

Earlier Civ games made it much harder to find the optimal decisions.
 
I'm not downplaying anything you're stating. I love Civ games just like I love DotA 2. The problem is I don't think either is worthy of benchmarking, and that's not their purpose in the market either.
 
I compared it to the 16-core category; it's one of the cheapest 16-core chips ever released.
But you can get your work done with 12 cores too, at a lower hardware cost; you just have to wait a little longer, which is acceptable for most.
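That "wait a little longer" can be put in rough numbers. Assuming a near-perfectly parallel workload (an idealisation; real scaling is worse), runtime is inversely proportional to core count:

```python
def runtime_ratio(baseline_cores: int, alternative_cores: int) -> float:
    """How much longer the alternative machine takes versus the baseline,
    assuming perfectly parallel work (an idealised upper bound)."""
    return baseline_cores / alternative_cores

# 12 cores versus 16: about 1.33x longer, so a 60-minute render on the
# 16-core part becomes roughly 80 minutes on the 12-core part.
slowdown = runtime_ratio(16, 12)
```

With less parallel workloads the gap shrinks further, which is why the 12-core option is often the better value.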
 