# AMD Ryzen Threadripper 2950X



## W1zzard (Aug 13, 2018)

The Ryzen Threadripper 2950X is AMD's new flagship 16-core processor. Precision Boost Overdrive works wonders to further increase its performance while always remaining stable. Our review of the 2950X presents four data sets: stock, manual OC, PBO enabled, and PBO with Local Memory Access mode.



----------



## Ferrum Master (Aug 13, 2018)

Did AMD really send you a review sample?


----------



## LiveOrDie (Aug 13, 2018)

Nice review, but it's pretty clear it would be faster than a lower-core-count CPU. We also didn't see any OC tests for the i9, so the only thing you can really compare is the price.


----------



## Joss (Aug 13, 2018)

Page 18 


> which would otherwise *we* unstable with a manual overclock


----------



## R0H1T (Aug 13, 2018)

Does the 2950X come with just two functional dies, like the previous-gen 1950X, or four dies?
Just saw page 3


----------



## Tsukiyomi91 (Aug 13, 2018)

For $900, the same price as the i9-7900X, with 6 more cores, better clock speeds, beefier cache, and a 2 nm smaller process than Intel's similarly priced HEDT processor, I'd say no one will complain about getting one... Though I know one thing: the packaging box it comes with is dope AF!


----------



## londiste (Aug 13, 2018)

Are you sure there were no motherboard vendor "optimizations" enabled?
Both 4.1 GHz boost on all cores as well as 237/280W power consumption sound suspicious.


----------



## TheGuruStud (Aug 13, 2018)

Those are some seriously impressive clocks for the process/core count.


----------



## mcraygsx (Aug 13, 2018)

Fantastic review. I still cannot get over how energy-efficient the Zen architecture really is with that many cores and that clock speed, compared to anything Intel has to offer.

I think it's time for Intel to design something beyond the Core architecture. At the same time, I cannot wait for your review of the TR 2990WX.


----------



## R0H1T (Aug 13, 2018)

londiste said:


> Are you sure there were no motherboard vendor "optimizations" enabled?
> Both 4.1 GHz boost on all cores as well as 237/280W power consumption sound suspicious.


Zen doesn't need separate motherboard boost; it has XFR2 and PB2 for better clocks out of the box.


----------



## Vya Domus (Aug 13, 2018)

I don't quite get it as to why the price is listed as disadvantage.


----------



## mcraygsx (Aug 13, 2018)

Vya Domus said:


> I don't quite get it as to why the price is listed as disadvantage.



That is what I was asking myself earlier. AMD is the reason an average person can build a decent rendering workstation for a decent price nowadays. Let's not forget the additional savings of not having to upgrade the motherboard in the near future, unlike with its competitor. At least give them some credit?


----------



## W1zzard (Aug 13, 2018)

Vya Domus said:


> I don't quite get it as to why the price is listed as disadvantage.


It's still a lot of money? I tried to explain that near the end of the conclusion


----------



## Vya Domus (Aug 13, 2018)

W1zzard said:


> It's still a lot of money? I tried to explain that near the end of the conclusion



Of course it is, but I believe that needs to be addressed within a context; otherwise there isn't much point in mentioning it.


----------



## HTC (Aug 13, 2018)

Vya Domus said:


> I don't quite get it as to why the price is listed as disadvantage.





W1zzard said:


> It's still a lot of money? I tried to explain that near the end of the conclusion



While I agree that it's a lot of money, bearing in mind the use case *this is intended for*, this is actually cheap and should have been a plus rather than a minus.

If one buys it *for gaming only*, then @W1zzard is 100% correct, but who in their right mind would do that?


----------



## $ReaPeR$ (Aug 13, 2018)

good one w1zz! thank you


----------



## bug (Aug 13, 2018)

Vya Domus said:


> Of course it is, but I believe that needs to be addressed within a context; otherwise there isn't much point in mentioning it.


So you'd be ok with not listing the price of a Titan V as a con either? 

You can add as much context as you want, HEDT CPUs aren't meant for the mainstream. They're priced out of the mainstream, so the price is always a con.


----------



## FeelinFroggy (Aug 13, 2018)

@W1zzard Love the review, too bad you could not throw in the 1950x data to see the improvement from the 2950x.


----------



## W1zzard (Aug 13, 2018)

FeelinFroggy said:


> @W1zzard Love the review, too bad you could not throw in the 1950x data to see the improvement from the 2950x.


Unfortunately AMD had no 1950X that I could borrow, and the one I borrowed for the 7900X review was returned a long time ago


----------



## Joss (Aug 13, 2018)

If this CPU can boost to 4.4 GHz on two cores, a 2800X (if it materializes) could do what... 4.6 GHz?


----------



## XiGMAKiD (Aug 13, 2018)

The next impressive thing after the price is still the multithreaded energy efficiency, AMD needs to keep that up


----------



## Shatun_Bear (Aug 13, 2018)

In the summary, overclocking approaching 5 GHz?! Wow, was this a typo?


----------



## notb (Aug 13, 2018)

IMO only 4 results in the review matter: database, Euler3D, VM, and TensorFlow. All are obviously disappointing, so it's hard to understand the final score this CPU got...
Putting aside simple multicore tasks (like encoding), this CPU performs like a 10-core Ryzen (though thankfully, it also only consumes that much power).
How is this possible?


mcraygsx said:


> That is what I was asking myself earlier. AMD is the reason an average person can build a decent rendering workstation for a decent price now a day.


Here's the thing: average person doesn't need a rendering workstation. This is a CPU for a... hard to define group of PC users (even if it was performing like a 16-core should). The amount of focus it gets on the internet is just weird...
We're getting second generation of a CPU that everyone talks about but no one buys... but where are AMD-powered high-end notebooks? 


Tsukiyomi91 said:


> for $900, the same price as the i9-7900X with 6 more cores, better clock speeds, beefier cache & 2nm smaller than Intel's similarly priced HEDT processor, I say no one will complain of getting one...


Seriously? What happened to the world? You prefer one CPU over another because it's made on a different node?


----------



## TheGuruStud (Aug 13, 2018)

bug said:


> So you'd be ok with not listing the price of a Titan V as a con either?
> 
> You can add as much context as you want, HEDT CPUs aren't meant for the mainstream. They're priced out of the mainstream, so the price is always a con.



Titan V is a glorified gaming card. Ram error piece of crap.


----------



## Shatun_Bear (Aug 13, 2018)

Joss said:


> If this CPU can boost to 4.4 GHz on two cores, a 2800X (if it materializes) could do what... 4.6 GHz?



I think the 12nm manufacturing process has improved since the 2700X rolled off the production line over 4 months ago, allowing more chips to hit higher clocks. But more than that, these are higher-binned, top-5% chips (EPYC, then Threadripper, then Ryzen is the binning priority) that can clock higher.

They could certainly release a 2800X on AM4 in October/November with potentially a 4.5 GHz boost on one core, but it would have to be cheaper than the 9900K. Seeing as Intel will give that a premium price of around $450, there would be a lot of room for manoeuvre, as they could release it for $350 and drop the price of the 2700X.

Having said all that, I don't think a 2800X is likely, as by the time the 9900K releases in October, we'd be quite close to the release of 7nm Ryzen, if the Q1 2019 release rumour is to be believed.


----------



## siluro818 (Aug 13, 2018)

notb said:


> Here's the thing: average person doesn't need a rendering workstation. This is a CPU for a... hard to define group of PC users (even if it was performing like a 16-core should). The amount of focus it gets on the internet is just weird...
> We're getting second generation of a CPU that everyone talks about but no one buys... but where are AMD-powered high-end notebooks?


You know just because you think it to be so, it doesn't make it true...
If you check the sales over at Mindfactory, which are a very good approximation of what is going on across the EU, you'll find that the 19xx Threadrippers have sold 2.5 times more CPUs than Intel's Core i9.
As for the US, the other day Amazon ran a sale on the 1950X and all their inventory disappeared in a matter of hours.
Give people good and cheap tech and they'll buy it. They may not NEED it per se, but since when has that stopped any smartphone user?


----------



## bug (Aug 13, 2018)

notb said:


> Here's the thing: average person doesn't need a rendering workstation. This is a CPU for a... hard to define group of PC users (even if it was performing like a 16-core should). The amount of focus it gets on the internet is just weird...
> We're getting second generation of a CPU that everyone talks about but no one buys... but where are AMD-powered high-end notebooks?



That's the life of a HEDT SKU. It doesn't need to make sense, it's just a show-off. And a preview of things about(ish) to trickle down to mainstream.


----------



## TheLaughingMan (Aug 13, 2018)

Nkd said:


> Let me know when intel has 32 cores  in the next month and how much it overclocks! Commone waiting for it lol. Yea you got nothing. Watch 7nm hit on zen 2 and you will have to make up an imaginary intel processor to compete. Zen 2 will probably take the lead in IPC with its 10-15% and on 7nm you think they would be able to hit above 4.5ghz? Common now! LoL



So far it looks like you will be disappointed there. The 2950X has been the king of this release, with the 32-core monster so far only being really good at rendering benchmarks and tasks. Since memory is only connected to two of the four dies, the additional cores beyond the 16 we have here cause average per-core memory throughput at full load to tank hard. The 2990WX also doesn't overclock as high, even with XFR, and power consumption balloons to 600-plus watts if you even try.

So if you have a mix of work and play, the 2950X is going to end up being the best bang for your buck by a mile.


----------



## Shatun_Bear (Aug 13, 2018)

TheLaughingMan said:


> So far it looks like you will be disappointed there. The 2950X has been the king of this release, with the 32-core monster so far only being really good at rendering benchmarks and tasks. Since memory is only connected to two of the four dies, the additional cores beyond the 16 we have here cause average per-core memory throughput at full load to tank hard. The 2990WX also doesn't overclock as high, even with XFR, and power consumption balloons to 600-plus watts if you even try.
> 
> So if you have a mix of work and play, the 2950X is going to end up being the best bang for your buck by a mile.



I don't understand how that reply has anything to do with what he posted?


----------



## Space Lynx (Aug 14, 2018)

I had to go elsewhere to find min-fps testing again... sigh... Intel destroys this by 10 fps across the board, yet again. Add in 20 more fps when you remove Denuvo from games (min fps, not avg), and you are talking a 30 fps min gain... smoothness and consistency... looks like Intel is still king for games.


----------



## ironcerealbox (Aug 14, 2018)

Good review. It stays objective. I do not want the ramblings of a tech-obsessed junkie. That is why I came here first.

I know that others have criticized W1zzard's listing the price as a con; however, he is consistent on his CPU reviews and he is consistent on calling out HEDT CPU prices. Look back over his previous CPU reviews and you will see that (along with his objectivity and consistency).

Nice work, W1zzard.


----------



## notb (Aug 14, 2018)

bug said:


> That's the life of a HEDT SKU. It doesn't need to make sense, it's just a show-off. And a preview of things about(ish) to trickle down to mainstream.


Agreed. So why is everyone running around arguing that this is a workstation CPU?
Basically, if a gamer wants to praise a CPU that he wouldn't want for gaming, he calls it a "workstation" / "productivity" CPU. As if that word added nobility or something. :-D

Workstations are certainly not for showing off (surely not compared to gaming desktops :-D). But they're also a dying breed in the age of cloud computing, so why is AMD pushing this niche so hard? They're making a lot more noise around TR than EPYC; it should be the other way around.

But going back from general ideas to the actual CPU: to me it's unattractive. Infinity Fabric clearly can't feed the cores fast enough. The CPU performs well in "passive" tasks (e.g. encoding) but scales badly in other scenarios. They can easily keep adding more and more cores, but what for? Just to hold the core-count crown? Sad.
Intel is behind on core count, but Intel's Mesh interconnect scales much better (at the cost of higher power consumption).



siluro818 said:


> If you check the sales over at Mindfactory, which are a very good approximation of what is going on across the EU, you'll find that the 19xx Threadrippers have sold 2.5 times more CPUs than Intel's Core i9.


Because no one buys 2066 either. But while Intel basically admits this is an enthusiast (show-off, niche) product, AMD tries to convince us Threadripper is perfect for serious work. Sadly, they failed to convince major workstation manufacturers. 

Sales at Mindfactory or any other similar store are not even close to "what's going on"... anywhere. Almost all business-oriented computers are sold by 4 parties: Dell, HPE, Lenovo and Apple.


----------



## nemesis.ie (Aug 14, 2018)

@W1zzard Just to note that it's not "on the fly" if a reboot is required.


----------



## malakudi (Aug 14, 2018)

Is "Clock Frequencies and Boost Clock Analysis" run on stock settings or with PBO enabled? 4100 all core turbo without PBO doesn't seem right, other reviews show all core turbo around 3700-3750. 4100 with PBO is something to be expected though. Can you elaborate on this?


----------



## W1zzard (Aug 14, 2018)

malakudi said:


> Is "Clock Frequencies and Boost Clock Analysis" run on stock settings or with PBO enabled? 4100 all core turbo without PBO doesn't seem right, other reviews show all core turbo around 3700-3750. 4100 with PBO is something to be expected though. Can you elaborate on this?


It is at stock. I'm not using OCCT or similar crazy pure stress workloads but something more similar to a multi-threaded SuperPi, so actual application usage is reflected


----------



## malakudi (Aug 14, 2018)

W1zzard said:


> It is at stock. I'm not using OCCT or similar crazy pure stress workloads but something more similar to a multi-threaded SuperPi, so actual application usage is reflected


Thanks for the info. Is it possible then to do similar analysis with PBO enabled?


----------



## nemesis.ie (Aug 14, 2018)

@W1zzard I know it would be a lot of work, but I'd be curious to see how it performs with a Vega 64/RX580.

Maybe even 2 or 3 games - one AMD favouring, one nVidia and one "in the middle" to see the CPU scaling with AMD architecture/drivers.

Thanks.


----------



## Tsukiyomi91 (Aug 14, 2018)

$900 doesn't sound like a "negative" considering Intel's similarly priced offerings have fewer cores on paper. That said, newer architectures need time to mature, but with how well it performs on release day, I say job well done to AMD for finally coming back into the competition.


----------



## phill (Aug 14, 2018)

Loved the review and loving the performance for the money. I guess if you are looking to spend $900/£900 on a CPU, the rest of the system is going to be expensive regardless, so the price is whatever it is.

I'd love one, and the 2990WX... just because. Both would be amazing crunchers to start with, but simply supporting AMD with these new CPUs would be a pleasure, and with the gaming performance being what it is for this type of CPU, I'd happily take a few FPS hit over paying for the Intel counterparts.
I'm sure my 5960X is a worthy CPU for gaming and such, but these new AMD ones are the next level I'd love to be a part of. Looks like there's still a reason for 1200 W+ PSUs after all.


----------



## nemesis.ie (Aug 14, 2018)

Another thought: PCWorld (no, not the crappy Dixons group one) showed that performance in Blender, or one of those, actually increased when Cinebench was running at the same time, without lowering the Cinebench score (ST, I think it was). This was on the 32-core, so the same may not apply to the 2950X, of course.

So it may well be that some tweaking is still needed in the boost algorithm to load/clock the cores better for various workloads and combinations. It may also be that MS needs to adjust their task scheduler yet again; this may well be because of the two dies that have no RAM attached on the 2990WX.


----------



## John Naylor (Aug 14, 2018)

Just as a general note... is there a description somewhere of the benchmarks used that describes the individual operations, the number of operations in a script, for example? With so much reliance on benchmarks by consumers, I'm often left wondering what the real-world impacts are. I look at SSD benchmarks... and all of our builds have one... but when I go into the BIOS and have the box boot off the SSHD instead of the SSD, no one ever notices. If that's the case, just how do we evaluate those benchmarks?

For example, how significant is it if a CPU performs an MS Office script in 600 ms or 450 ms if that script includes 62 operations, all of which require a keystroke in between? If it takes 8 seconds versus 4 seconds to rotate an image in AutoCAD, OK, maybe I gain 2.5 seconds after accounting for the reaction time of deciding what my next action is and moving my hands to the keyboard to input it. If rendering a drawing takes 25 minutes versus 40 minutes, yes, that is big. But I'd like an updated understanding of just what each benchmark involves. After all, isn't any script that includes, say, 60 operations and takes less than 60 seconds meaningless when the user is in the chain? It's kind of like traversing downtown in a major city: it doesn't have a big impact if one bike rider can traverse 120 blocks twice as fast when they have to stop and wait for pedestrian traffic on every 2nd block, allowing the slower rider to catch up.
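The human-in-the-loop point can be put in numbers with a quick Python sketch. The 62-operation script and the 450 ms vs. 600 ms CPU times come from the example above; the 1.5 s of keystroke/think time per step is a hypothetical figure I picked for illustration:

```python
def wall_clock_s(ops, cpu_ms_per_op, user_ms_between_ops):
    """Total wall-clock time for a script where a user acts between operations."""
    cpu = ops * cpu_ms_per_op                 # total CPU time across all ops
    user = (ops - 1) * user_ms_between_ops    # keystroke/think time between ops
    return (cpu + user) / 1000

# 62 operations, ~1.5 s of user interaction between each one.
fast = wall_clock_s(62, 450 / 62, 1500)  # CPU finishes the whole script in 450 ms
slow = wall_clock_s(62, 600 / 62, 1500)  # CPU finishes the whole script in 600 ms

print(f"fast CPU: {fast:.1f} s, slow CPU: {slow:.1f} s")
# The 150 ms CPU difference disappears into roughly 91.5 s of typing/think time.
```

With the user in the chain, both runs land around 92 seconds; only in unattended batch work (the 25-minute render) does the CPU delta survive.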


----------



## Steevo (Aug 14, 2018)

John Naylor said:


> Just as a general note... is there a description somewhere of the benchmarks used that describes the individual operations, the number of operations in a script, for example? With so much reliance on benchmarks by consumers, I'm often left wondering what the real-world impacts are. I look at SSD benchmarks... and all of our builds have one... but when I go into the BIOS and have the box boot off the SSHD instead of the SSD, no one ever notices. If that's the case, just how do we evaluate those benchmarks?
> 
> For example, how significant is it if a CPU performs an MS Office script in 600 ms or 450 ms if that script includes 62 operations, all of which require a keystroke in between? If it takes 8 seconds versus 4 seconds to rotate an image in AutoCAD, OK, maybe I gain 2.5 seconds after accounting for the reaction time of deciding what my next action is and moving my hands to the keyboard to input it. If rendering a drawing takes 25 minutes versus 40 minutes, yes, that is big. But I'd like an updated understanding of just what each benchmark involves. After all, isn't any script that includes, say, 60 operations and takes less than 60 seconds meaningless when the user is in the chain? It's kind of like traversing downtown in a major city: it doesn't have a big impact if one bike rider can traverse 120 blocks twice as fast when they have to stop and wait for pedestrian traffic on every 2nd block, allowing the slower rider to catch up.




Batch processing: when opening a large shared, dependent database or Excel file, the clicks don't happen, and the results may need to be recomputed when importing different values, which results in longer wait times. For example, if we have an Excel spreadsheet with 200K entries and want to update 100K items with new values based on rules, then export the differences to see what the effects were, faster is better; there are multiple functions happening: sort, add, subtract, multiply, divide, etc.
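As a rough sketch of that kind of rule-based batch update in plain Python: the 200K rows and ~100K updated items match the post's example, but the column layout and the 10%-discount rule are made up for illustration.

```python
import csv
import random

# Hypothetical spreadsheet: 200K (item, price) rows.
random.seed(0)
rows = [(i, round(random.uniform(1, 100), 2)) for i in range(200_000)]

# Rule: discount every item priced above 50 by 10% (roughly half the rows).
changes = []   # (item, old_price, new_price) for the diff export
updated = []   # the full updated table
for item, price in rows:
    if price > 50:
        new_price = round(price * 0.9, 2)
        changes.append((item, price, new_price))
        updated.append((item, new_price))
    else:
        updated.append((item, price))

# Export only the rows that changed, old vs. new, to inspect the effect.
with open("price_changes.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["item", "old", "new"])
    writer.writerows(changes)

print(f"{len(changes)} of {len(rows)} rows updated")
```

No keystrokes sit between the operations here, which is exactly why this is the kind of workload where raw CPU throughput shows up directly in the wait time.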


----------



## HTC (Aug 14, 2018)

@W1zzard : dunno if you still have the CPUs used for the review, but if you do, consider running a test with more than one app simultaneously, instead of only one at a time.

Something along the lines of what PCWorld did for the 2990WX (bottom half of page two). Perhaps a combination of two or three of the fastest tests, so that there's more strain on the CPUs.


----------



## siluro818 (Aug 15, 2018)

notb said:


> Because no one buys 2066 either. But while Intel basically admits this is an enthusiast (show-off, niche) product, AMD tries to convince us Threadripper is perfect for serious work. Sadly, they failed to convince major workstation manufacturers.
> Sales at Mindfactory or any other similar store are not even close to "what's going on"... anywhere. Almost all business-oriented computers are sold by 4 parties: Dell, HPE, Lenovo and Apple.


Uhm yeah, I'm just going to copy-paste what Intel says about their own products:

*Maximize Performance*
Whether you are working on your latest feature film or the next episode of a YouTube* series, the unlocked Intel® Core™ X-series processors are designed to scale to your performance needs by using the two fastest cores at higher frequencies and up to 18 cores when extreme mega tasking is required. Experience extreme performance, immersive 4K visuals, high speed storage and memory, and the latest technological advancements – all designed to get you from planning to final product faster than ever.





*Power Your Creativity*
Spend more time creating and less time waiting. The Intel® Core X-series processor can handle your most demanding workload. Upload and edit your 360˚ videos quickly and experience VR videos–all in stunning 4K. There are no limits to what you can create on your new computer.




*Mega-Task to the Extreme*
When creating your best work, you need the most responsive technology to handle multiple, CPU-intensive tasks at once. With an Intel® Core X-series processor, you can edit your video, render 3D effects, and compose the soundtrack simultaneously without compromising your computer’s performance.

It's literally ALL about serious work.

You're like that other guy who has made up his own definition of HEDT, but for "workstation", cherry-picking certain words and ending up convincing himself that the end result is some kind of holy gospel...

Well what can I say - you make for a pretty shitty preacher mate xD


----------



## EatingDirt (Aug 15, 2018)

Shatun_Bear said:


> I think the 12nm manufacturing process has improved from when the 2700X rolled off the production line over 4 months ago, allowing more chips to hit higher clocks. But of course, more than that, these are higher-binned, top 5% chips (EPYC to Threadripper to Ryzen is the priority order) that can clock higher.
> 
> They could certainly release a 2800X on AM4 in October/November with a 4.5Ghz boost on one core potentially, but it would have to be cheaper than the 9900K. But seeing as Intel will give that a premium price of around $450, there would be a lot of room for manoeuvre as they could release for $350 and drop the price of the 2700X down.
> 
> Having said all that, I don't think a 2800X is likely  As by the time the 9900K releases in October, we'd be quite close to the release of 7nm Ryzen if the Q1 2019 release rumour is to be believed.



I think you're dreaming that a 9900K will cost only $450. I can't see it going for cheaper than a 7820X, so I'd bet money on around $550, maybe more (the 9700K may come in at the 7820X's price point of $470-500). The price may go down once 7nm Ryzen comes out, if AMD can get clock rates up.



W1zzard said:


> It's still a lot of money? I tried to explain that near the end of the conclusion



I understand in a way, but price is always relative to the competition, and the competition in this case gives you six fewer cores for the same amount of money ($45 more, actually). I'd say the price is expensive if all one wanted to do with it was game/stream, but if someone's building a workstation, the price actually deserves to be in the pro column, not the con.


----------



## bug (Aug 15, 2018)

EatingDirt said:


> I understand in a way, but price is always relative to the competition, and the competition in this case gives you six fewer cores for the same amount of money ($45 more, actually). I'd say the price is expensive if all one wanted to do with it was game/stream, but if someone's building a workstation, the price actually deserves to be in the pro column, not the con.


Nope. Price is always absolute. The higher it is, the more stuff you give up to buy one of these. One of anything, really. That's why, when for the price of one CPU you can get a (really) used car or a pair of prosumer speakers, that price is a con.


----------



## Shatun_Bear (Aug 15, 2018)

EatingDirt said:


> I think you're dreaming that a 9900K will cost only $450. I can't see it going for cheaper than a 7820X, *so I'd bet money on around $550*, maybe more (the 9700K may come in at the 7820X's price point of $470-500). The price may go down once 7nm Ryzen comes out, if AMD can get clock rates up.



They can't price it at more than $500 when it won't be noticeably faster than their competitor's top 8C/16T 2700X, which by the time of release could see a price cut to around $275-300. I mean, I know people (foolishly) pay a premium for Intel, but an extra $200-250 for slightly faster 8 cores would take the cake.

The 7820X is a premium HEDT CPU; the 9900K is mainstream desktop. They'll slap at most a $400 price tag on it and totally wipe out their newest HEDT CPUs, which won't be hard, as no one bought them and they were already trying to pretend Skylake-X was a bad dream anyway.


----------



## EatingDirt (Aug 15, 2018)

bug said:


> Nope. Price is always absolute. The higher it is, the more stuff you give up to buy one of these. One of anything, really. That's why, when for the price of one CPU you can get a (really) used car or a pair of prosumer speakers, that price is a con.



Some people can buy a $1,000,000 house; some can only afford to rent. Everyone has different budgets.

At what point do you draw the line at price being a pro or a con? The answer is you can't, so you need to compare it to its competition.



Shatun_Bear said:


> They can't price it more than $500 when it won't be noticeably faster than their competitors top 8-C 16-T 2700X, which by the time of release, could see a price cut to around $275-300. I mean I know people (foolishly) pay a premium for Intel, but an extra $200-250 for a slightly faster 8-cores would take the cake.
> 
> The 7820X is a premium HEDT CPU. 9900K is mainstream desktop. They'll slap at most a $400 price tag on it and totally wipe out their newest HEDT CPUs which won't be hard as no-one bought them and they were already trying to pretend Skylake-X was a bad dream anyway.



The 7820X was a budget HEDT CPU that had only one HEDT feature: quad-channel memory. It was really just a budget entry so people could spend more $$ down the line on the X299 platform.

There's a reason why Intel suddenly started using the i9 moniker. It signifies their ultra-high-performance line, and it will come at a premium price. In fact, you can see it in the name: *i7* 7820X. The processor after that was the *i9* 7900X. Expect the i9s to be more expensive; they're clocked higher (rumor) than the 7820X, and the *only* thing they don't have that the 7820X has is quad-channel memory.

These i9s will be like Titans from NVIDIA. Super-enthusiasts will buy them even if they're priced at $500 (9700K) and $550 (9900K).

Of course you could be right and I could be wrong, we'll see eventually either way.


----------



## nemesis.ie (Aug 15, 2018)

lynx29 said:


> I had to go elsewhere to find min-fps testing again... sigh... Intel destroys this by 10 fps across the board, yet again. Add in 20 more fps when you remove Denuvo from games (min fps, not avg), and you are talking a 30 fps min gain... smoothness and consistency... looks like Intel is still king for games.



Please use percentages rather than fps; fps figures are meaningless unless you know the base. 10 fps when you are putting out 150 is vastly different from 10 fps when you are putting out 30, for example.
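To put numbers on that, here's a tiny Python helper using the 150 fps and 30 fps baselines from the example above:

```python
def percent_gain(base_fps, delta_fps):
    """Relative improvement that an fps delta represents over a given baseline."""
    return 100 * delta_fps / base_fps

# The same +10 fps is a rounding error at 150 fps but a huge jump at 30 fps.
print(f"{percent_gain(150, 10):.1f}%")  # → 6.7%
print(f"{percent_gain(30, 10):.1f}%")   # → 33.3%
```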


----------



## bug (Aug 15, 2018)

EatingDirt said:


> Some people can buy a $1,000,000 house; some can only afford to rent. Everyone has different budgets.
> 
> At what point do you draw the line at price being a pro or a con? The answer is you can't, so you need to compare it to its competition.



The answer is *you* can't. The opportunity cost is the same for everyone.


----------



## TheLaughingMan (Aug 15, 2018)

EatingDirt said:


> Some people can buy a $1,000,000 house; some can only afford to rent. Everyone has different budgets.
> 
> At what point do you draw the line at price being a pro or a con? The answer is you can't, so you need to compare it to its competition.



The point at which the price of a given item begins to realistically restrict access to the product. As you stated, some people can afford a $1 million house. Those people are in a subgroup because they have the income to put them in that position. It's called being rich. HEDT parts are named as such because both the item itself and the platform it sits on cost enough to restrict some people from buying them.

In his conclusion, W1zz pointed that out and recommended it for those who either run multi-threaded applications (like render engines) all day and/or use their computer to make money, where the reduced time justifies the cost.


----------



## RealNeil (Aug 15, 2018)

Good review. I'd love to have one of these, but I already have the i9-7900X here and I can't justify getting the TR part.

It's good to see a certain amount of pressure being applied to Intel these days.
The CPU market is better for us now.


----------



## Space Lynx (Aug 16, 2018)

nemesis.ie said:


> Please use percentages rather than fps, fps are meaningless unless you know the base. 10fps when you are putting out 150 is vastly different to 10fps when you are putting out 30 for example.



I agree with this, and the numbers I saw from the original Ryzen launch were nothing to laugh at: 30 fps to 40 fps min in many games, even at 1440p. Avg fps doesn't matter at all at 1440p, as they both equalize in everything except min fps. Ryzen 2 is a little better. If Ryzen 3 gets even better, it's probably going to be worth it to avoid Intel's crap once and for all. I'm just waiting for X570 mobos and gen 3.


----------



## notb (Aug 16, 2018)

siluro818 said:


> It's literally ALL about serious work.


It's literally all about basic multimedia tasks (encoding / rendering), which BTW has to be one of the most boring niches for using a PC. :-D
To be honest, I don't really understand who Intel and AMD target with this marketing. Vloggers? One-man video businesses?
Because larger companies doing video editing will, obviously, use OEM workstations (or, most likely, servers).

"With up to 18 cores and 36 threads the Intel® Core™ X-series processor family is our most powerful ever. Turn your PC into a studio: *produce amazing 4K or 360-degree videos, stunning photos, or amazing music*. This is the ultimate tool for* gaming and virtual reality* experience the power to do it all. "
https://www.intel.com/content/www/us/en/products/processors/core/x-series.html
You see? No science, no engineering, no medical applications, no financial simulations. Video and gaming. A-w-e-s-o-m-e. :-D

Keep in mind encoding/rendering is a best-case scenario for a slow high-core-count workstation. Essentially, one can do this on a GPU, provided it supports the required instructions and libraries.
Generally, this is NOT how workstation-class PCs are used in real life.

Happily, I work on a VM at the moment and don't have to care about the hardware at all (2C/4T :-D). But if someone decided to give me a powerful desktop, I'd easily go for the modern Xeons, simply because of their superior single-thread performance and flexibility.
2066 and TR4 are not even business-oriented platforms. Putting aside the gaming-oriented Area-51, AFAIK you won't find them in the top-3 OEMs' catalogues.


> You're like that other guy who has made up his own definition of HEDT, but for "workstation", cherry-picking certain words and ending up convincing himself that the end result is some kind of holy gospel...


"HEDT" is an artificial, weird and poorly defined term. And I hate it, honestly. It's an insult to the machines that let us do all those great things.
People spend so much time thinking if their PCs are HEDT-enough (e.g. benchmarking), they don't have time to use it.


----------



## siluro818 (Aug 16, 2018)

That's the thing, though: tech like this opens new possibilities for markets that didn't exist before. A lot of computation-heavy work is outsourced to freelancers today, often by those same companies which would buy the OEMs, while places which previously had trouble putting together the hardware needed to build clusters for such tasks can now pull it off with a single workstation. It's not all about the big corporate clients. Science, engineering, and medical simulations go to the server farms; the new multi-core CPUs are in fact targeting private professionals. Media production might seem boring to you compared to grand science, but it actually makes the world go round to a large extent. People consume it incessantly, and the growth ain't gonna stop soon.


----------

