
AMD Ryzen Threadripper 2950X

Here's the thing: the average person doesn't need a rendering workstation. This is a CPU for a... hard-to-define group of PC users (even if it were performing like a 16-core should). The amount of focus it gets on the internet is just weird...
We're getting the second generation of a CPU that everyone talks about but no one buys... but where are the AMD-powered high-end notebooks? :)
You know just because you think it to be so, it doesn't make it true...
If you check the sales over at Mindfactory, which are a very good approximation of what is going on across the EU, you'll find that the 19xx Threadrippers have sold 2.5 times more CPUs than Intel's Core i9.
As for the US, the other day Amazon ran a sale on the 1950X where all their inventory disappeared in a matter of hours.
Give people good, cheap tech and they'll buy it. They may not NEED it per se, but since when has that stopped any smartphone user? ;)
 
Here's the thing: the average person doesn't need a rendering workstation. This is a CPU for a... hard-to-define group of PC users (even if it were performing like a 16-core should). The amount of focus it gets on the internet is just weird...
We're getting the second generation of a CPU that everyone talks about but no one buys... but where are the AMD-powered high-end notebooks? :)

That's the life of a HEDT SKU. It doesn't need to make sense, it's just a show-off. And a preview of things about(ish) to trickle down to mainstream.
 
Let me know when Intel has 32 cores in the next month and how much it overclocks! C'mon, I'm waiting for it lol. Yeah, you got nothing. Watch 7nm hit with Zen 2 and you'll have to make up an imaginary Intel processor to compete. Zen 2 will probably take the lead in IPC with its 10-15% gain, and on 7nm, don't you think they'd be able to hit above 4.5GHz? C'mon now! LoL

So far it looks like you will be disappointed there. The 2950X has been the king of this release, with the 32-core monster so far only being really good at rendering benchmarks and tasks. Since memory is only connected to two of the four dies, the additional cores beyond the 16 we have here cause average per-core memory throughput at full load to tank hard. The 2990WX also doesn't overclock as high, even with XFR, and shows massive gains in power consumption, to the tune of 600+ watts if you even try.

So if you have a mix of work and play, the 2950X is going to end up being the best bang for your buck by a mile.
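To put rough numbers on the per-core throughput point, here's a minimal back-of-the-envelope sketch; the 60 GB/s aggregate quad-channel figure is an assumption for illustration, not a measured value:

```python
# Illustrative only: total memory bandwidth is fixed by the quad-channel
# controllers, so the per-core share shrinks as more loaded cores contend.
TOTAL_BW_GBPS = 60.0  # assumed aggregate quad-channel DDR4 bandwidth

def per_core_bandwidth(active_cores: int, total_bw: float = TOTAL_BW_GBPS) -> float:
    """Even split of aggregate memory bandwidth across fully loaded cores."""
    return total_bw / active_cores

for cores in (16, 32):
    print(f"{cores} cores -> ~{per_core_bandwidth(cores):.2f} GB/s per core")
```

Real 2990WX scaling is worse than this even split suggests, since two of the four dies have no locally attached memory and pay an extra Infinity Fabric hop on every access.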
 
So far it looks like you will be disappointed there. The 2950X has been the king of this release, with the 32-core monster so far only being really good at rendering benchmarks and tasks. Since memory is only connected to two of the four dies, the additional cores beyond the 16 we have here cause average per-core memory throughput at full load to tank hard. The 2990WX also doesn't overclock as high, even with XFR, and shows massive gains in power consumption, to the tune of 600+ watts if you even try.

So if you have a mix of work and play, the 2950X is going to end up being the best bang for your buck by a mile.

I don't understand what that reply has to do with what he posted.
 
I had to go elsewhere to find min fps testing again... sigh... Intel destroys this by 10 fps across the board, yet again. Add in 20 more FPS when you remove Denuvo from games (min fps, not avg), and you're talking a 30 fps min gain... smoothness and consistency... looks like Intel is still king for games.
 
Good review. It stays objective. I do not want the ramblings of a tech-obsessed junkie. That is why I came here first.

I know that others have criticized W1zzard's listing the price as a con; however, he is consistent on his CPU reviews and he is consistent on calling out HEDT CPU prices. Look back over his previous CPU reviews and you will see that (along with his objectivity and consistency).

Nice work, W1zzard.
 
That's the life of a HEDT SKU. It doesn't need to make sense, it's just a show-off. And a preview of things about(ish) to trickle down to mainstream.
Agree. So why is everyone running around arguing that this is a workstation CPU?
Basically, if a gamer wants to praise a CPU that he wouldn't want for gaming, he calls it a "workstation" / "productivity" CPU. As if that word added nobility or something. :-D

Workstations are certainly not for showing off (surely not compared to gaming desktops :-D). But they're also a dying breed in the age of cloud computing, so why is AMD pushing this niche so hard? They're making a lot more noise around TR than EPYC - it should be the other way around.

But going back from general ideas to the actual CPU: to me it's unattractive. Infinity Fabric clearly can't feed the cores fast enough. The CPU performs well in "passive" tasks (e.g. encoding) but scales badly in other scenarios. They can easily keep adding more and more cores, but what for? Just to have the core count crown? Sad.
Intel is behind on core count, but Intel's Mesh scales much better (at the cost of higher power consumption).

If you check the sales over at Mindfactory, which are a very good approximation of what is going on across the EU, you'll find that the 19xx Threadrippers have sold 2.5 times more CPUs than Intel's Core i9.
Because no one buys 2066 either. But while Intel basically admits this is an enthusiast (show-off, niche) product, AMD tries to convince us Threadripper is perfect for serious work. Sadly, they failed to convince major workstation manufacturers. :)

Sales at Mindfactory or any other similar store are not even close to "what's going on"... anywhere. Almost all business-oriented computers are sold by 4 parties: Dell, HPE, Lenovo and Apple.
 
@W1zzard Just to note that it's not "on the fly" if a reboot is required.
 
Is "Clock Frequencies and Boost Clock Analysis" run on stock settings or with PBO enabled? 4100 all core turbo without PBO doesn't seem right, other reviews show all core turbo around 3700-3750. 4100 with PBO is something to be expected though. Can you elaborate on this?
 
Is "Clock Frequencies and Boost Clock Analysis" run on stock settings or with PBO enabled? 4100 all core turbo without PBO doesn't seem right, other reviews show all core turbo around 3700-3750. 4100 with PBO is something to be expected though. Can you elaborate on this?
It is at stock. I'm not using OCCT or similar crazy pure-stress workloads, but something more similar to a multi-threaded SuperPi, so actual application usage is reflected.
 
It is at stock. I'm not using OCCT or similar crazy pure-stress workloads, but something more similar to a multi-threaded SuperPi, so actual application usage is reflected.
Thanks for the info. Is it possible then to do similar analysis with PBO enabled?
 
@W1zzard I know it would be a lot of work, but I'd be curious to see how it performs with a Vega 64/RX580.

Maybe even 2 or 3 games: one AMD-favouring, one NVIDIA, and one "in the middle" to see the CPU scaling with AMD architecture/drivers.

Thanks.
 
$900 doesn't sound like a "negative" considering Intel's similarly priced offerings have a lower core count on paper, though with that said, newer architectures will need time to mature. But with how well it performs on release day, I say job well done to AMD for finally coming back into the competition.
 
Loved the review and loving the performance for the money.. I guess if you are looking to spend $900/£900 on a CPU, the rest of the system is going to be expensive regardless, so the price is whatever it is :)

I'd love one and the 2990WX.. Just because :) Both would be amazing crunchers to start with, but just supporting AMD with these new CPUs would be a pleasure, and with the gaming performance being what it is for the type of CPUs they are, I'd happily take a few FPS hit over paying for the Intel counterparts.
I'm sure my 5960X is a worthy CPU for gaming and such.. But these new AMD ones are the next level that I'd love to be a part of :) Looks like there's still a reason for the 1200W+ PSUs after all :)
 
Another thought: PCWorld (no, not the crappy Dixons group one) were showing that performance of Blender or one of those actually increased when Cinebench was running at the same time, without lowering the Cinebench (ST, I think it was) score. This was on the 32-core, so the same may not apply to the 2950X, of course.

So it may well be that some tweaking is still needed to the boost algorithm to load/clock the cores better for various loads/workloads and combinations. It may be that MS need to adjust their task scheduler yet again; this may well be because of the two dies that have no RAM attached on the 2990WX.
 
Just as a general note... Is there a description of the benchmarks used somewhere that describes the individual operations (the number of operations in a script, for example)? With so much reliance on benchmarks by consumers, I'm oft left wondering what the real-world impacts are. I look at SSD benchmarks... and all of our builds have one... but when I go into the BIOS and have the box boot off the SSHD instead of the SSD, no one ever notices. If that's the case, just how do we evaluate those benchmarks?

For example, how significant is it if a CPU performs an MS Office script in 600 ms or 450 ms if that script includes 62 operations, all of which require a keystroke in between? If it takes 8 seconds versus 4 seconds to rotate an image in AutoCAD, OK, maybe I gain 2.5 seconds accounting for reaction time in seeking / deciding what my next action is and moving my hands to the KB to input it. If rendering a drawing takes 25 minutes versus 40 minutes, yes, that is big. But I'd like to have an updated understanding of just what each benchmark involves. After all, isn't any script which includes, say, 60 operations and takes less than 60 seconds meaningless when the user is in the chain? It's kinda like traversing downtown in a major city... it doesn't have a big impact if one bike rider can traverse 120 blocks twice as fast when they have to stop and wait for pedestrian traffic on every 2nd block, allowing the slower rider to catch up.
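The user-in-the-chain argument can be made concrete with a little arithmetic. This sketch assumes a fixed think/keystroke delay per operation (the 1-second pause is a made-up figure), which is a simplification but shows why interactive scripts hide CPU differences while long unattended jobs don't:

```python
def effective_speedup(old_cpu_s: float, new_cpu_s: float,
                      ops: int, user_delay_s: float) -> float:
    """Wall-clock speedup when each operation is gated by user input time."""
    old_total = old_cpu_s + ops * user_delay_s
    new_total = new_cpu_s + ops * user_delay_s
    return old_total / new_total

# Office-script case: 600 ms vs 450 ms of CPU time, 62 operations,
# each followed by an assumed 1-second keystroke/think pause.
print(effective_speedup(0.600, 0.450, 62, 1.0))   # ~1.002x: imperceptible
# Rendering case: 40 min vs 25 min, no user in the loop.
print(effective_speedup(40 * 60, 25 * 60, 0, 0))  # 1.6x: very noticeable
```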
 
Just as a general note... Is there a description of the benchmarks used somewhere that describes the individual operations (the number of operations in a script, for example)? With so much reliance on benchmarks by consumers, I'm oft left wondering what the real-world impacts are. I look at SSD benchmarks... and all of our builds have one... but when I go into the BIOS and have the box boot off the SSHD instead of the SSD, no one ever notices. If that's the case, just how do we evaluate those benchmarks?

For example, how significant is it if a CPU performs an MS Office script in 600 ms or 450 ms if that script includes 62 operations, all of which require a keystroke in between? If it takes 8 seconds versus 4 seconds to rotate an image in AutoCAD, OK, maybe I gain 2.5 seconds accounting for reaction time in seeking / deciding what my next action is and moving my hands to the KB to input it. If rendering a drawing takes 25 minutes versus 40 minutes, yes, that is big. But I'd like to have an updated understanding of just what each benchmark involves. After all, isn't any script which includes, say, 60 operations and takes less than 60 seconds meaningless when the user is in the chain? It's kinda like traversing downtown in a major city... it doesn't have a big impact if one bike rider can traverse 120 blocks twice as fast when they have to stop and wait for pedestrian traffic on every 2nd block, allowing the slower rider to catch up.


Batch processing: when opening a large dependent shared database or Excel file, the clicks don't happen, and the results may need to be recomputed when importing different values, which results in longer wait times. For example, if we have an Excel spreadsheet with 200K entries and we would like to update 100K items with new values based on rules and export the differences to see what the effects were, faster is better; there are multiple functions happening: sort, add, subtract, multiply, divide, etc.
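As a plain-Python stand-in for that spreadsheet scenario (the dataset, the doubling rule, and the under-50 predicate are all made up for illustration), this is the kind of unattended bulk update where raw CPU speed actually shows:

```python
import random

random.seed(42)
# 200K entries keyed by row id, as in the spreadsheet example above.
entries = {i: random.uniform(0, 100) for i in range(200_000)}

def apply_rule(items: dict, rule, predicate) -> dict:
    """Update every entry matching `predicate` via `rule`; return the diffs."""
    diffs = {}
    for key, old in items.items():
        if predicate(old):
            new = rule(old)
            items[key] = new
            diffs[key] = (old, new)
    return diffs

# Roughly half the uniform values fall under 50, so ~100K rows get updated.
changed = apply_rule(entries, rule=lambda v: v * 2, predicate=lambda v: v < 50)
print(f"{len(changed)} of {len(entries)} entries updated")
```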
 
@W1zzard : dunno if you still have the CPUs used for the review but, if you do, consider making a test with more than one app running simultaneously, instead of only one at a time.

Something along the lines of what PCWorld did for the 2990WX (bottom half of page two). Perhaps a combination of two or three of the fastest tests so that there's more strain on the CPUs.
 
Because no one buys 2066 either. But while Intel basically admits this is an enthusiast (show-off, niche) product, AMD tries to convince us Threadripper is perfect for serious work. Sadly, they failed to convince major workstation manufacturers. :)
Sales at Mindfactory or any other similar store are not even close to "what's going on"... anywhere. Almost all business-oriented computers are sold by 4 parties: Dell, HPE, Lenovo and Apple.
Uhm yeah, I'm just going to copy-paste what Intel says about their own products:

Maximize Performance
Whether you are working on your latest feature film or the next episode of a YouTube* series, the unlocked Intel® Core™ X-series processors are designed to scale to your performance needs by using the two fastest cores at higher frequencies and up to 18 cores when extreme mega tasking is required. Experience extreme performance, immersive 4K visuals, high speed storage and memory, and the latest technological advancements – all designed to get you from planning to final product faster than ever.

Power Your Creativity
Spend more time creating and less time waiting. The Intel® Core X-series processor can handle your most demanding workload. Upload and edit your 360˚ videos quickly and experience VR videos–all in stunning 4K. There are no limits to what you can create on your new computer.

Mega-Task to the Extreme
When creating your best work, you need the most responsive technology to handle multiple, CPU-intensive tasks at once. With an Intel® Core X-series processor, you can edit your video, render 3D effects, and compose the soundtrack simultaneously without compromising your computer’s performance.

It's literally ALL about serious work.

You're like that other guy who has made up his own definition of HEDT, but for "workstation", cherry-picking certain words and ending up convincing himself that the end result is some kind of holy gospel...

Well what can I say - you make for a pretty shitty preacher mate xD
 
I think the 12nm manufacturing process has improved from when the 2700X rolled off the production line over 4 months ago, allowing more chips to hit higher clocks. But of course, more than that, these are higher-binned, top 5% chips (EPYC to Threadripper to Ryzen is the priority order) that can clock higher.

They could certainly release a 2800X on AM4 in October/November with a potential 4.5GHz boost on one core, but it would have to be cheaper than the 9900K. But seeing as Intel will give that a premium price of around $450, there would be a lot of room for manoeuvre, as they could release it for $350 and drop the price of the 2700X.

Having said all that, I don't think a 2800X is likely :) As by the time the 9900K releases in October, we'd be quite close to the release of 7nm Ryzen if the Q1 2019 release rumour is to be believed.

I think you're dreaming that a 9900K will cost only $450. I can't see it going for cheaper than a 7820X, so I'd bet money on around $550, maybe more (the 9700K may come in at the 7820X's price point of $470-500). The price may go down once 7nm Ryzen comes out, if AMD can get clock rates up.

It's still a lot of money? I tried to explain that near the end of the conclusion.

I understand in a way, but price is always relative to the competition, and the competition in this case gives you six fewer cores for the same amount of money ($45 more, actually). I'd say the price is expensive if all one wanted to do with it was game/stream, but if someone's building a workstation, the price actually deserves to be in the pro category, not the con.
 
I understand in a way, but price is always relative to the competition, and the competition in this case gives you six fewer cores for the same amount of money ($45 more, actually). I'd say the price is expensive if all one wanted to do with it was game/stream, but if someone's building a workstation, the price actually deserves to be in the pro category, not the con.
Nope. Price is always absolute. The higher it is, the more stuff you give up to buy one of these. One of anything, really. That's why when, for the price of one CPU, you can get a (really) used car or a pair of prosumer speakers, that price is a con.
 
I think you're dreaming that a 9900K will cost only $450. I can't see it going for cheaper than a 7820X, so I'd bet money on around $550, maybe more (the 9700K may come in at the 7820X's price point of $470-500). The price may go down once 7nm Ryzen comes out, if AMD can get clock rates up.

They can't price it at more than $500 when it won't be noticeably faster than their competitor's top 8C/16T 2700X, which, by the time of release, could see a price cut to around $275-300. I mean, I know people (foolishly) pay a premium for Intel, but an extra $200-250 for slightly faster 8 cores would take the cake.

The 7820X is a premium HEDT CPU. The 9900K is mainstream desktop. They'll slap at most a $400 price tag on it and totally wipe out their newest HEDT CPUs, which won't be hard as no one bought them, and they were already trying to pretend Skylake-X was a bad dream anyway.
 
Nope. Price is always absolute. The higher it is, the more stuff you give up to buy one of these. One of anything, really. That's why when, for the price of one CPU, you can get a (really) used car or a pair of prosumer speakers, that price is a con.

Some people can buy a $1,000,000 house, some can only afford to rent. Everyone has different budgets.

At what point do you draw the line at price being a pro or a con? The answer is you can't, so you need to compare it to its competition.

They can't price it at more than $500 when it won't be noticeably faster than their competitor's top 8C/16T 2700X, which, by the time of release, could see a price cut to around $275-300. I mean, I know people (foolishly) pay a premium for Intel, but an extra $200-250 for slightly faster 8 cores would take the cake.

The 7820X is a premium HEDT CPU. The 9900K is mainstream desktop. They'll slap at most a $400 price tag on it and totally wipe out their newest HEDT CPUs, which won't be hard as no one bought them, and they were already trying to pretend Skylake-X was a bad dream anyway.

The 7820X was a budget HEDT CPU that had only one HEDT feature: quad-channel memory. It was really just a budget entry so people could spend more $$ down the line on the X299 platform.

There's a reason why Intel suddenly started using the i9 moniker. It signifies their ultra-high-performance line and will come at a premium price. In fact, you can see it in the name: i7-7820X. The processor after that was the i9-7900X. Expect the i9s to be more expensive; they're clocked higher (rumor) than the 7820X, and the only thing they don't have that the 7820X has is quad-channel memory.

These i9s will be like Titans from NVIDIA. Super-enthusiasts will buy them even if they're priced at $500 (9700K) and $550 (9900K).

Of course you could be right and I could be wrong, we'll see eventually either way.
 
I had to go elsewhere to find min fps testing again... sigh... Intel destroys this by 10 fps across the board, yet again. Add in 20 more FPS when you remove Denuvo from games (min fps, not avg), and you're talking a 30 fps min gain... smoothness and consistency... looks like Intel is still king for games.

Please use percentages rather than fps; fps figures are meaningless unless you know the base. 10 fps when you're putting out 150 is vastly different from 10 fps when you're putting out 30, for example.
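A quick sketch of why the baseline matters, using the 150 and 30 fps figures from the post above:

```python
def pct_gain(base_fps: float, delta_fps: float) -> float:
    """Relative gain of a flat fps delta over a given baseline."""
    return 100.0 * delta_fps / base_fps

# The same +10 fps is a very different story at different baselines:
print(f"+10 fps on 150 fps: {pct_gain(150, 10):.1f}%")  # 6.7%
print(f"+10 fps on  30 fps: {pct_gain(30, 10):.1f}%")   # 33.3%
```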
 
Some people can buy a $1,000,000 house, some can only afford to rent. Everyone has different budgets.

At what point do you draw the line at price being a pro or a con? The answer is you can't, so you need to compare it to its competition.

The answer is you can't. The opportunity cost is the same for everyone.
 