# 3930K -> 7800X: worth it?



## BarbaricSoul (Sep 12, 2017)

Thinking about possibly upgrading my 3930K system to a 7800X system. The system is mostly used for crunching, with some minor gaming every now and then. My only concern is: is it actually worth it? How much of a performance gain will the upgrade yield? The 3930K has served me very well, and still does what I ask of it, but is there enough of a performance increase between the 3930K and the 7800X to justify the cost of the upgrade? The upgrade would cost about $1000, including a much needed and overdue SSD.

https://pcpartpicker.com/list/GDRMsJ

And before anyone suggests it, $600 for a 7820X is out of my budget.


----------



## JalleR (Sep 12, 2017)

If you are looking at the price-to-performance gain, I would wait until the release of the i7-8700(K) to make that decision.


----------



## BarbaricSoul (Sep 12, 2017)

With all due respect, I'm not asking about the 8700K. I'm asking about the 7800X.


----------



## Vayra86 (Sep 12, 2017)

Well, it is pre-Haswell, so there is a chunk of IPC you're missing. At the same time you have several Intel generations AND Threadripper available, so for a HEDT upgrade the timing seems quite right; price points have shifted a bit due to competition as well. Is it worth it? Only if you max out your current rig doing what you do. Performance is only useful if it's usable. A lot will depend on clocks; I reckon we're talking about a 20% IPC gap altogether, give or take.


----------



## BarbaricSoul (Sep 12, 2017)

Vayra86 said:


> Is it worth it? Only if you max out your current rig doing what you do. Performance is only useful if it's usable.



Crunching runs my CPU at its maximum output. So yeah, I do use every bit of performance the CPU has to offer.


----------



## Vayra86 (Sep 12, 2017)

If this is all-core 3.8 GHz like it shows here (although with my rig that CPU clock is never correct; somehow I can run my Ivy at 5.4 GHz according to Win 10), then there is a good bit of performance to be had, along with the IPC gap. That'll amount to 35-50% if you put a decent OC on there.
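As a rough sanity check on that 35-50% figure, the combined gain is roughly the IPC uplift multiplied by the clock ratio. A minimal sketch, assuming the ~20% IPC gap mentioned above and hypothetical OC clocks (none of these are measured numbers):

```python
# Rough estimate of the combined gain from an IPC uplift and a clock bump.
# The 20% IPC gap and the target clocks are assumptions from this thread,
# not benchmark results.

def combined_gain(ipc_uplift, old_clock_ghz, new_clock_ghz):
    """Multiplicative scaling: (1 + IPC gain) * (new clock / old clock) - 1."""
    return (1.0 + ipc_uplift) * (new_clock_ghz / old_clock_ghz) - 1.0

# 3930K at 3.8 GHz all-core vs. a hypothetical modest 4.3 GHz OC on the new chip:
low = combined_gain(0.20, 3.8, 4.3)   # roughly 36%
# ...and a stronger 4.7 GHz OC:
high = combined_gain(0.20, 3.8, 4.7)  # roughly 48%

print(f"{low:.0%} to {high:.0%}")
```

This lands in the same 35-50% ballpark, though real crunching workloads won't scale perfectly with either clock or IPC.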


----------



## BarbaricSoul (Sep 12, 2017)

Vayra86 said:


> If this is all-core 3.8 GHz like it shows here (although with my rig that CPU clock is never correct; somehow I can run my Ivy at 5.4 GHz according to Win 10), then there is a good bit of performance to be had, along with the IPC gap. That'll amount to 35-50% if you put a decent OC on there.



It's all cores running at 3.8 GHz. It can do more, but then my room starts to get too hot to sleep in.


----------



## EarthDog (Sep 12, 2017)

Plenty of benchmarks out there to give you an idea of performance. Then once you know how it performs, you can decide if it's 'worth it'.
http://www.anandtech.com/show/11550...-core-i9-7900x-i7-7820x-and-i7-7800x-tested/9

I know, with respect, you said 7800X... but why not wait for the 8700K? It's a hex-core too, and likely coming in cheaper than the 7800X, as well as performing better in games.


----------



## BarbaricSoul (Sep 12, 2017)

EarthDog said:


> I know, with respect, you said 7800X... but why not wait for the 8700K? It's a hex-core too, and likely coming in cheaper than the 7800X, as well as performing better in games.



Plans for a future upgrade involve buying a used 10-12 core.


----------



## Outback Bronze (Sep 12, 2017)

I say go for it mate.

The board alone will give you a few upgrades with M.2 etc.

I would stretch for the 7820X for a more real feel of an upgrade up front, but as you say, you can buy a used one down the track. So no qualms there.

You got the itch? Scratch it.


----------



## jagjitnatt (Sep 12, 2017)

BarbaricSoul said:


> Thinking about possibly upgrading my 3930K system to a 7800X system. The system is mostly used for crunching, with some minor gaming every now and then. My only concern is: is it actually worth it? How much of a performance gain will the upgrade yield? The 3930K has served me very well, and still does what I ask of it, but is there enough of a performance increase between the 3930K and the 7800X to justify the cost of the upgrade? The upgrade would cost about $1000, including a much needed and overdue SSD.
> 
> https://pcpartpicker.com/list/GDRMsJ
> 
> And before anyone suggests it, $600 for a 7820X is out of my budget.



Not worth it. 
If you think that $600 is out of budget, and you're sleeping in the same room as the PC, it clearly is not justified to spend extra money on it.
We're talking CPU + Memory (which is expensive right now) + Motherboard (also expensive) + SSD (might not be on the radar, but you will get an itch soon).

That's too much ($800-$1000) for, at best, 20% extra performance.


----------



## GorbazTheDragon (Sep 12, 2017)

When considering the 7800x in isolation, I would have given a big fat *yes*. 

If you can say based on your use case that X299>Z370 due to the PCIe/memory/etc...
It's the most cores/price you can get on intel's stack, on par with the expected pricing of the 8700k.

But, considering that you are only moving up from the 3930k, I would have to change that to a no. You'd be looking at a ~$600 investment for nothing more than a not very impressive performance jump and the extra expansion options (m.2+newer chipset). Maybe if you were eyeing the 8-core I would say yes in all cases, but 3930k=>7800x just looks like a mega expensive sidegrade.


----------



## Rehmanpa (Sep 20, 2017)

Why has nobody brought up Ryzen? His CPU will be running heavily multithreaded applications, so a Ryzen 1700, with 4 more threads than the 7800X, would be quite beneficial. Plus its TDP is 65 W vs 140 W on the 7800X, not to mention it's cheaper, and the motherboards for it are cheaper too. I mean, with the extra threads that would be a significant upgrade, IMO. He sleeps in the same room as the PC, so the lower the CPU TDP the better.


----------



## Peter Lindgren (Sep 20, 2017)

I'm happy with my Xeon 2680 v2 running 10 cores at 3.5 GHz. You could do the same if your motherboard supports Xeons. They cost about $200 on eBay.


----------



## ikeke (Sep 21, 2017)

+1 @Peter Lindgren

Get a used E5-2697 v2 Xeon from eBay? That'll give you more bang for your buck, and it will slaughter the newer i7 in crunching.

I had a 14c/28t Xeon in my daily driver last year, and two pairs of 16c/32t and 18c/36t Xeons in crunchers.
http://valid.x86.fr/l1235p
http://valid.x86.fr/86t9wz
http://valid.x86.fr/uuhsyn

(also, thumbs up for the R7 1700; I "downgraded" to it since it's only 65 W, so you can cool it almost passively)


----------



## jaggerwild (Sep 21, 2017)

My 3930K clocks past 5 GHz, so there's no reason to upgrade. I have a 6700K, but I always go back to the 3930K. I use the 6700K for board testing.


----------



## BarbaricSoul (Sep 21, 2017)

jaggerwild said:


> My 3930K clocks past 5 GHz, so there's no reason to upgrade. I have a 6700K, but I always go back to the 3930K. I use the 6700K for board testing.



My 3930K isn't that good of an OC'er. The best I've gotten out of this particular chip is 4.5 GHz. And since it runs at 100% load crunching whenever the system is powered up, anything past 3.8 GHz generates enough heat to actually start increasing the room temperature. I may go Ryzen just because of this.


----------



## EarthDog (Sep 21, 2017)

So you can crunch slower using a similar amount of power when overclocked at 4 GHz, its limit?


----------



## ikeke (Sep 21, 2017)

IIRC, a 3930K @ 4.5-5 GHz is right up there with an OC'd AMD FX-8xxx or FX-9xxx on the power consumption front (~300 W+ on the CPU).

My R7 1700 at stock clocks and undervolted by 100 mV draws ~60 W normally and ~80 W under 100% AVX load. I don't see much value in pushing for the silicon limit by dumping vcore into a chip; 3.5-3.6 GHz all-core can be achieved with a marginal overvolt or (Si-lottery permitting) at stock.

https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/


----------



## GorbazTheDragon (Sep 21, 2017)

BarbaricSoul said:


> My 3930K isn't that good of an OC'er. The best I've gotten out of this particular chip is 4.5 GHz. And since it runs at 100% load crunching whenever the system is powered up, anything past 3.8 GHz generates enough heat to actually start increasing the room temperature. I may go Ryzen just because of this.


The 8-core Ryzens should also be great, but again you are just looking at a pretty expensive sidegrade. $450 before RAM, at least!


----------



## thebluebumblebee (Sep 21, 2017)

BTW, @Norton and @infrared are crunching with Ryzens.  You can check out their results at FDC.  Here's Norton01's 1600X


----------



## jaggerwild (Sep 21, 2017)

Do these people get paid by AMD, or do they have something to gain by consistently shoving AMD up everyone's azz?


----------



## ikeke (Sep 21, 2017)

..or perhaps it's just a good well-priced product?


----------



## thebluebumblebee (Sep 21, 2017)

Ryzen is a godsend for crunchers. Many threads, fast and efficient.

Edit: And I forgot AFFORDABLE.


----------



## Rehmanpa (Sep 22, 2017)

jaggerwild said:


> These people get paid by AMD, or they have something to gain by consistantly shoving AMD up everyones azz?


Ryzen is objectively better than Intel's current HEDT lineup.
Lower power consumption, very similar IPC, way lower price (especially with mobos taken into consideration), plus did I mention it's cheaper? The problem with it on the gaming front is that people often compare it to an i7 7700K with both using a 1080 Ti. The fact of the matter is that they ought to compare a Ryzen 1600 with a 1080 Ti vs a 7700K with a 1080, because it's about the same price at that point.
Tl;dr: quit hating on AMD for making a good product.


----------



## lyndonguitar (Sep 22, 2017)

Rehmanpa said:


> Ryzen is objectively better than Intel's current HEDT lineup.
> Lower power consumption, very similar IPC, way lower price (especially with mobos taken into consideration), plus did I mention it's cheaper? The problem with it on the gaming front is that people often compare it to an i7 7700K with both using a 1080 Ti. The fact of the matter is that they ought to compare a Ryzen 1600 with a 1080 Ti vs a 7700K with a 1080, because it's about the same price at that point.
> Tl;dr: quit hating on AMD for making a good product.


Good points, I like Ryzen as well, but generally people tend to keep CPUs longer than GPUs, as they age better. So they kind of want to prioritize the CPU to last longer, and for gaming the 7700K will last longer than the 1600. People are more likely to buy the 7700K now with the 1080, then upgrade to a better GPU later, than to get the Ryzen with the 1080 Ti now, then get a better Ryzen later.


----------



## Rehmanpa (Sep 22, 2017)

lyndonguitar said:


> Good points, I like Ryzen as well, but generally people tend to keep CPUs longer than GPUs, as they age better. So they kind of want to prioritize the CPU to last longer, and for gaming the 7700K will last longer than the 1600. People are more likely to buy the 7700K now with the 1080, then upgrade to a better GPU later, than to get the Ryzen with the 1080 Ti now, then get a better Ryzen later.


How will an i7 7700K last longer than a 1600? If anything, the extra cores on the 1600 will ensure that IT lasts longer. The 1080 Ti system would provide better fps than the 7700K system, and the 1600 would have better multithreaded rendering performance, meaning that the system is faster in heavily threaded applications and gaming for the same price. Idk why you think the 7700K will last longer. It's always been the whole "more cores = more future-proof" thing. The IPC difference is very small and becoming less and less of an issue with software updates and game optimization.


----------



## Vayra86 (Sep 22, 2017)

lyndonguitar said:


> Good points, I like Ryzen as well, but generally people tend to keep CPUs longer than GPUs, as they age better. So they kind of want to prioritize the CPU to last longer, and for gaming the 7700K will last longer than the 1600. People are more likely to buy the 7700K now with the 1080, then upgrade to a better GPU later, than to get the Ryzen with the 1080 Ti now, then get a better Ryzen later.



A 7700K doesn't last longer than a Ryzen at all. TODAY'S 7700K advantage over a Ryzen at 4 GHz, *at 1080p and 1440p only*, and which also means delidding and voiding the warranty on the Intel CPU, is a meagre 10% in a best-case scenario, until you hit a GPU wall anyway.

That 10% best-case, super situational advantage really isn't worth it at all. The 7700K is a niche product, specifically for the high-refresh-rate 1080p gamer, if you ask me. For everyone else, including gamers, Ryzen is a much safer buy, or the upcoming 8700K. The additional cores will pay off in some engines already, and this will only increase over the lifetime of a CPU you buy today. You stand to gain far more than 10%, regardless of resolution and without voiding the warranty.

You can't realistically say in the same comment that A) your CPU lasts much longer, and then B) proceed to void the warranty to extract the performance needed to make the same CPU worthwhile over competing products. It just doesn't make any sense at all.


----------



## dj-electric (Sep 22, 2017)

The future of X299 and LGA2066 is foggy, IMO.
The deal is that getting a 7800X now is of course a bad decision if there are no plans to upgrade further. That's why you have the 8700K.
If upgrades are in the future, then hopefully next year's 10+ core parts won't be so utterly expensive, as 8 cores will enter the mainstream platform.

If a 10-core is $999 today, maybe the 10-core of next year will be in the $699 range, hopefully.


----------



## Mirkoskji (Sep 22, 2017)

Are you counting the motherboard upgrade factor? You can say your 7700K lasts longer, maybe. But then you have to change motherboard and CPU when you upgrade. With Ryzen there are two to three generations that will run on the same platform. It can be a game changer considering the economic factor.


----------



## EarthDog (Sep 22, 2017)

Rehmanpa said:


> How will an i7 7700K last longer than a 1600? If anything, the extra cores on the 1600 will ensure that IT lasts longer. The 1080 Ti system would provide better fps than the 7700K system, and the 1600 would have better multithreaded rendering performance, meaning that the system is faster in heavily threaded applications and gaming for the same price. Idk why you think the 7700K will last longer. It's always been the whole "more cores = more future-proof" thing. The IPC difference is very small and becoming less and less of an issue with software updates and game optimization.


True, but again, most people don't heavily thread anything. 8t is fine for the next few years. Only IF you use those threads, and when more software uses them. We've been saying quad for nearly 10 years now... and finally a quad is worth it... it's going to be a few more years before anything over 8t is useful to the majority.



Vayra86 said:


> A 7700K doesn't last longer than a Ryzen at all. TODAY'S 7700K advantage over a Ryzen at 4 GHz, *at 1080p and 1440p only*, and which also means delidding and voiding the warranty on the Intel CPU, is a meagre 10% in a best-case scenario, until you hit a GPU wall anyway.
> 
> That 10% best-case, super situational advantage really isn't worth it at all. The 7700K is a niche product, specifically for the high-refresh-rate 1080p gamer, if you ask me. For everyone else, including gamers, Ryzen is a much safer buy, or the upcoming 8700K. The additional cores will pay off in some engines already, and this will only increase over the lifetime of a CPU you buy today. You stand to gain far more than 10%, regardless of resolution and without voiding the warranty.
> 
> You can't realistically say in the same comment that A) your CPU lasts much longer, and then B) proceed to void the warranty to extract the performance needed to make the same CPU worthwhile over competing products. It just doesn't make any sense at all.


You don't have to delid to reach 5 GHz. Some chips do though.

That said, I'll buy into more, slightly slower cores in a few years when it's needed. Most users and games don't perform better with more cores and won't for years. Until then, I prefer the faster chip for my, and most people's, uses.



Mirkoskji said:


> Are you counting the motherboard upgrade factor? You can say your 7700K lasts longer, maybe. But then you have to change motherboard and CPU when you upgrade. With Ryzen there are two to three generations that will run on the same platform. It can be a game changer considering the economic factor.


The problem with that is missing out on new technology... I'd rather upgrade my board every 3 years with my professor than have it last 6 years and be missing out on the latest tech.

Some dont care about that though.


----------



## Durvelle27 (Sep 22, 2017)

What about Threadripper? Great IPC, lanes, and features, with more cores.


----------



## Mirkoskji (Sep 22, 2017)

EarthDog said:


> The problem with that is missing out on new technology... I'd rather upgrade my board every 3 years with my professor than have it last 6 years and be missing out on the latest tech.
> 
> Some dont care about that though.


The next technological advances are mostly PCIe 4.0 or 5.0 and DDR5. Maybe PCIe won't even need a rewiring of the motherboard.


----------



## BarbaricSoul (Sep 22, 2017)

Keep the conversation going. I'm enjoying you guys weighing the options between the different available platforms. Let me state a couple of factors that are relevant to my possible upgrade.

1. My computer's main use is crunching. In that, I use every available core, thread, and GHz the CPU has.
2. I'm looking to keep the upgrade below $1100 USD. I plan on doing the upgrade in the next month or two, depending on how money goes.
3. The computer is set up in my bedroom, so heat generation is definitely a factor.
4. I currently share a house with two friends, so the power bill is split three ways. Electrical usage is a non-factor.
5. I don't have a preference for Intel over AMD. I'm about bang for buck, but am willing to splurge a little for better performance.
6. I originally was set on the 7800X because future upgrade plans involve buying a used 10-core 2-3 years down the road, hence my lack of interest in the 8700K.

So with that said, please continue.


----------



## Durvelle27 (Sep 22, 2017)

BarbaricSoul said:


> Keep the conversation going. I'm enjoying you guys weighing the options between the different available platforms. Let me state a couple of factors that are relevant to my possible upgrade.
> 
> 1. My computer's main use is crunching. In that, I use every available core, thread, and GHz the CPU has.
> 2. I'm looking to keep the upgrade below $1100 USD. I plan on doing the upgrade in the next month or two, depending on how money goes.
> ...


https://pcpartpicker.com/list/DRkJcc

12C/24T
16GB of 3200MHz
and a beast board


Edit: TR outperforms the current 7-series HEDT by a decent margin in multi-threaded apps while using less power and running cooler. Its IPC is also just about where Skylake's is.


----------



## jaggerwild (Sep 22, 2017)

Intel motherboards are far superior.....


----------



## Durvelle27 (Sep 22, 2017)

jaggerwild said:


> Intel motherboards are far superior.....


Dude where

in imagination land


----------



## ikeke (Sep 22, 2017)

With TR you really do need to fill all four memory channels, otherwise you'll starve the CPU.

You could also look at Epyc 7281 and Supermicro workstation board (H11SSL-i / H11SSL-C / H11SSL-NC).


----------



## GorbazTheDragon (Sep 22, 2017)

Durvelle27 said:


> https://pcpartpicker.com/list/DRkJcc
> 
> 12C/24T
> 16GB of 3200MHz
> ...



I'm also on the TR train for this one. It's the one that makes most sense in terms of IPC. You lose out on some gaming performance but given @BarbaricSoul 's needs I'd say that is a non-factor. Upgrading to a 12c makes much more sense than the <10c chips.

The only thing I would say is to wait a few more months until there are more options in terms of boards and the drivers and reviews paint a more complete picture, especially given that this is more of a long-term investment.


----------



## Durvelle27 (Sep 22, 2017)

GorbazTheDragon said:


> I'm also on the TR train for this one. It's the one that makes most sense in terms of IPC. You lose out on some gaming performance but given @BarbaricSoul 's needs I'd say that is a non-factor. Upgrading to a 12c makes much more sense than the <10c chips.
> 
> The only thing I would say is to wait a few more months until there are more options in terms of boards and the drivers and reviews paint a more complete picture, especially given that this is more of a long-term investment.


The Taichi is actually one of the best boards available on both TR4 & AM4.


----------



## Rehmanpa (Sep 22, 2017)

Durvelle27 said:


> What about Threadripper? Great IPC, lanes, and features, with more cores.


That's what I plan my next build to be. The price/performance over Intel is absolutely mindblowing, and the PCIe lanes are awesome (NVMe RAID boot FTW).
edit: I would get the ASRock Professional Gaming motherboard; it's currently on sale on Newegg, and it also has 10 Gb Ethernet (which is an expensive add-on card when you want it later, and then it's not taking up one of your precious PCIe slots).



GorbazTheDragon said:


> I'm also on the TR train for this one. It's the one that makes most sense in terms of IPC. You lose out on some gaming performance but given @BarbaricSoul 's needs I'd say that is a non-factor. Upgrading to a 12c makes much more sense than the <10c chips.
> 
> The only thing I would say is to wait a few more months until there are more options in terms of boards and the drivers and reviews paint a more complete picture, especially given that this is more of a long-term investment.


The IPC loss versus, say, an i7 7700K is negligible unless you're playing at 1080p. Since I'm currently playing at a combination of 1600p/1440p, and planning on upgrading to a 3440x1440 ultrawide, it will be hardly noticeable. So it really depends on your resolution. Threadripper is more than capable of gaming, and with all of the cost savings vs Intel's competing products, and its pure efficiency (180 W TDP for a 32-thread processor, now that's efficient), it just makes sense if you're willing to dish out the $$$. This whole "pure gaming, then get Intel" crap gets old after a while; who wants to ONLY ever game? I mostly do gaming, but when I do video rendering, or video recording and gameplay, or streaming and gameplay, or hosting a server while playing a game, I am hardly ever "just doing pure gaming."

Sorry if that only makes sense in my head, kinda tired writing all this lol


----------



## EarthDog (Sep 22, 2017)

If you can use the cores, its awesome!


----------



## drade (Sep 22, 2017)

Don't mind if I chime in here. @BarbaricSoul , I am upgrading in the next 1-2 months as well. My budget is similar to yours without a GPU, ~$1300. At first I was leaning towards Ryzen. The performance and value are there with Ryzen, new BIOS updates are allowing more stability and OC headroom on the current AM4 platform, and the additional cores are there for multitasking. My plan was to get the 1700X and attempt a 4.0-4.1 GHz OC. Another member here mentioned the ability to upgrade to future CPU generations on the same AM4 platform. If this is true, the value is set in stone with Ryzen long term.

However, a few months ago rumors of Coffee Lake started surfacing. Now I am leaning towards the 8700K. The price is about ~$400, so yes it is expensive, more expensive than the previous generation. However, the OC capabilities and stability of the platform are well established. I will wait for release and true reviews/benchmark results. If heat generation at higher clocks is not overwhelming compared to last generation, I will most likely buy an 8700K with a nice Z370 mobo. However, the platform ends with Coffee Lake. Intel has been forced by AMD to conform to new standards of innovation. This is good for consumers. So thank you, AMD.

If your main goal is crunching, I would suggest weighing the 8700K vs. Ryzen at this time. I believe the future of Zen will provide consumers with more OC potential, lower temps, lower power consumption, and improved stability.


----------



## BiggieShady (Sep 22, 2017)

EarthDog said:


> True, but again, most people don't heavily thread anything. 8t is fine for the next few years. Only IF you use those threads, and when more software uses them. We've been saying quad for nearly 10 years now... and finally a quad is worth it... it's going to be a few more years before anything over 8t is useful to the majority.


Finally the time has come when you can produce a game build in any current engine (even Unity) that uses multithreaded rendering on as many threads as are available, out of the box. Too bad that's not the only condition one has to meet for good parallelization. Rendering a scene using many draw calls in a multithreaded way is kind of asymmetrical because of all the overheads: one thread gets to be the heavy one that sorts and pumps the draw calls. Yes, the thing that DX12 tries to mitigate with less overhead.
Only, the solution is stupidly simple ... draw a scene using one fucking draw call ... yes, OpenGL supports this (don't know about DX; command lists are similar) ... you use the GPU as usual and issue draw calls, but instead of pinging back and forth over PCIe with driver and API overheads on each call, you build a buffer you can issue to the GPU once per frame as one huge draw call for the entire scene. The act of building up the buffer can be perfectly parallelized, and a single draw call makes no thread heavier than the others.
Seems like a digression, but it's not, because adoption of these techniques will make multi-core behemoths very interesting for gaming soon. It may be off topic though.
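To make the parallel buffer-build idea concrete, here's an illustrative sketch with no real graphics API at all (the names `encode_token`, `build_chunk`, and `build_command_buffer`, and the token layout, are made up for the example): each worker encodes the draw tokens for its own slice of the scene, and the chunks are concatenated into one linear buffer that would be handed to the GPU in a single submission per frame.

```python
# Illustrative only: simulates building a command/token buffer in parallel,
# then "submitting" it once. No actual GPU or OpenGL calls are made.
from concurrent.futures import ThreadPoolExecutor
import struct

def encode_token(object_id):
    # Stand-in for packing one bindless draw command into linear memory:
    # here, just an (object id, index count) pair as 8 bytes.
    return struct.pack("<II", object_id, object_id * 3)

def build_chunk(objects):
    # Each worker encodes its own slice; no shared mutable state,
    # so no single thread ends up heavier than the others.
    return b"".join(encode_token(o) for o in objects)

def build_command_buffer(scene, workers=4):
    slices = [scene[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        chunks = list(pool.map(build_chunk, slices))
    # One linear buffer -> one big "draw call" per frame.
    return b"".join(chunks)

buf = build_command_buffer(list(range(1000)))
# A real renderer would now hand `buf` to the GPU as a single submission.
```

(In CPython the GIL limits true parallelism for this CPU-bound encode; the point is the structure: independent chunk builds feeding one submission, which is what removes the single heavy draw-call thread.)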


----------



## thebluebumblebee (Sep 22, 2017)

EarthDog said:


> I'd rather upgrade my board every 3 years with my *professor*


Whoa dude, TMI.


----------



## Aquinus (Sep 22, 2017)

BarbaricSoul said:


> 3. The computer is set up in my bedroom, so heat generation is definitely a factor.
> 4. I currently share a house with two friends, so the power bill is split three ways. Electrical usage is a non-factor.


For what it's worth, I think these are two contradictory statements. #4 can't be a non-factor, because power usage is directly proportional to heat generated.

I would wait to see what happens in the next couple of months if you're not in a rush. The 3930k is definitely a CPU that can run hot and suck down power though. I was astonished at how much more the 3930k draws over the 3820.

Edit: If you're crunching, I wouldn't be surprised if Threadripper proves to be the better perf:cost option though.


----------



## jaggerwild (Sep 22, 2017)

Durvelle27 said:


> Dude where
> 
> in imagination land


Yeah,
 'Cause I noticed you do not mention the advantage of an AMD board, just yer opinion, which is subject to crowd approval. Talk all day, you'll never get an AMD CPU to clock as high as an Intel. Wow, so you get more cores for yer buck. I'm not a fanboy, I'm a realist.
 I understand it's cheap to use an AMD, but with all the issues. Do they use dual rate memory yet? Or have an IGPU? I know the boards don't overclock well yet, as I've been doing my reading.


----------



## thebluebumblebee (Sep 22, 2017)

jaggerwild said:


> Or have IGPU?


Well, you got us there.


----------



## jaggerwild (Sep 23, 2017)

Or the Thread ripper dummy dies..........


----------



## GorbazTheDragon (Sep 23, 2017)

Rehmanpa said:


> The IPC loss versus, say, an i7 7700K is negligible unless you're playing at 1080p. Since I'm currently playing at a combination of 1600p/1440p, and planning on upgrading to a 3440x1440 ultrawide, it will be hardly noticeable. So it really depends on your resolution. Threadripper is more than capable of gaming, and with all of the cost savings vs Intel's competing products, and its pure efficiency (180 W TDP for a 32-thread processor, now that's efficient), it just makes sense if you're willing to dish out the $$$. This whole "pure gaming, then get Intel" crap gets old after a while; who wants to ONLY ever game? I mostly do gaming, but when I do video rendering, or video recording and gameplay, or streaming and gameplay, or hosting a server while playing a game, I am hardly ever "just doing pure gaming."
> 
> Sorry if that only makes sense in my head, kinda tired writing all this lol



I really don't want to get dragged into this i7 vs R5/R7 argument. It's already a known factor, and those who run into single-core bottlenecking like I have will know about it. But this thread asked about number crunching, so, just as you are hinting, why even bother mentioning the mainstream chips? Especially the Intel ones...


jaggerwild said:


> Or the Thread ripper dummy dies..........


----------



## Vya Domus (Sep 23, 2017)

BiggieShady said:


> Only, the solution is stupidly simple ... draw a scene using one fucking draw call ... yes, OpenGL supports this (don't know about DX; command lists are similar) ... you use the GPU as usual and issue draw calls, but instead of pinging back and forth over PCIe with driver and API overheads on each call, you build a buffer you can issue to the GPU once per frame as one huge draw call for the entire scene. The act of building up the buffer can be perfectly parallelized, and a single draw call makes no thread heavier than the others.



Command lists in DirectX work kind of backwards. You issue the draw calls into a buffer which can then be executed later; building up the buffer is single-threaded, but when you later execute the commands you can do them simultaneously, each with its own thread. It's probably trickier, but if you do it carefully you can constantly execute commands in parallel.



jaggerwild said:


> Yeah,
> 'Cause I noticed you do not mention the advantage of an AMD board, just yer opinion, which is subject to crowd approval. Talk all day, you'll never get an AMD CPU to clock as high as an Intel. Wow, so you get more cores for yer buck. I'm not a fanboy, I'm a realist.
> I understand it's cheap to use an AMD, but with all the issues. Do they use dual rate memory yet? Or have an IGPU? I know the boards don't overclock well yet, as I've been doing my reading.



Uhm, you do know that how much AMD/Intel chips OC has nothing to do with the motherboards, right? It's the silicon that dictates these limits.

Also, what's "dual rate memory"? You mean DDR?


----------



## jaggerwild (Sep 23, 2017)

Ryzen wasn't running dual rate, also the VRMs on the new boards.

 Vya,
 Put yer money where your mouth is. A ROG board will overclock better than a lower-class board. Do I need to school you?

 Update: Ryzen couldn't run quad-channel memory, sorry, my bad. I'm posting reasons why Ryzen might not be the best choice. Also Threadripper, with its massive footprint.


----------



## Rehmanpa (Sep 23, 2017)

jaggerwild said:


> Ryzen wasn't running Dual rate, also the VRM on the new boards.
> 
> Vya,
> Put yer money where your mouth is. A ROG board will overclock better than a lower-class board. Do I need to school you, didn't Tom's Hardware teach you anything?


Dude what's your problem? You've added literally nothing productive to this thread.


----------



## HammerON (Sep 23, 2017)

Alright. Chill out folks. Be respectful and agree to disagree if need be.
Sometimes it is best just to ignore some posts...


----------



## R-T-B (Sep 23, 2017)

jaggerwild said:


> Yeah,
> 'Cause I noticed you do not mention the advantage of an AMD board



Nor did you mention the advantage of an Intel one.

I think he means Dual Rank memory, at any rate.  And yes, it works.


----------



## BiggieShady (Sep 23, 2017)

Vya Domus said:


> Command lists in DirectX work kind of backwards. You issue the draw calls into a buffer which can then be executed later; building up the buffer is single-threaded, but when you later execute the commands you can do them simultaneously, each with its own thread. It's probably trickier, but if you do it carefully you can constantly execute commands in parallel.


Interesting. Now that I think of it, the OpenGL solution I was referring to also uses command lists, but with token buffers, and tokens are simple structures in a linear memory block; you can prepare that buffer as you please (bindless textures and objects are a must).

It's crazy how PC graphics APIs are used sub-optimally in practice.
Anyway, here are the whole slideshows:
https://www.slideshare.net/tlorach/opengl-nvidia-commandlistapproaching-zerodriveroverhead
https://www.slideshare.net/CassEveritt/approaching-zero-driver-overhead

These techniques may allow Threadrippers to saturate 4 Titans in, for example, a 360-degree projective VR setup... or 8K game rendering with an enormous amount of dynamically animated clutter on the scene.
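The record-once, execute-in-parallel pattern described above can be sketched in plain Python. This is a toy illustration of the pattern only, not real DirectX or OpenGL API calls; the "draw" command and scene names are made up:

```python
# Toy sketch of the command-buffer pattern: commands are recorded
# single-threaded into a flat buffer, then executed concurrently.
# Not real DirectX/OpenGL calls; "draw" and the scene are made up.
from concurrent.futures import ThreadPoolExecutor

def record_commands(scene):
    # Recording stays single-threaded: build a linear buffer of
    # simple command tuples (cf. NV_command_list token buffers).
    return [("draw", obj) for obj in scene]

def execute_command(cmd):
    op, payload = cmd
    # A real driver would submit this to the GPU; here we just
    # format the command to stand in for execution.
    return f"{op}:{payload}"

scene = ["mesh_a", "mesh_b", "mesh_c"]
buf = record_commands(scene)

# Once the buffer exists, execution can fan out across threads.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(execute_command, buf))

print(results)  # ['draw:mesh_a', 'draw:mesh_b', 'draw:mesh_c']
```

Note that `pool.map` preserves submission order even though the work runs on multiple threads, which mirrors how a command buffer's ordering is fixed at record time.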


----------



## EarthDog (Sep 23, 2017)

Vya Domus said:


> Uhm, you do know that how much AMD/Intel chips OC has nothing to do with the motherboard, right? It's the silicon that dictates these limits.


Sort of. Put a shit board on either and they won't reach the same clocks as with a good board. And this is only true for the last gen for AMD, and the past couple for Intel. Prior to that, the board made a HUGE difference for both.

I'd also be concerned now with X299 boards limiting Intel CPUs... and potential TRipper overclocking as well. So... there's that... 

To say it has "nothing" to do with it is a poor choice of words.


----------



## Aquinus (Sep 23, 2017)

Vya Domus said:


> Uhm, you do know that how much AMD/Intel chips OC has nothing to do with the motherboard, right? It's the silicon that dictates these limits.


I assure you that the motherboard does make a difference in how far you can push a CPU. I'm sure that @cadaveca would agree with that assessment.


Vya Domus said:


> Also, what's "dual rate memory"? You mean DDR?


I think that was a typo for dual *rank* memory.


----------



## GorbazTheDragon (Sep 23, 2017)

It really depends on how far you are pushing the chips. If you are under full water cooling and hammering the clocks, I expect you will find a pretty big difference between a lot of the boards.

As you say this is more of a problem with X299 and X399, but again it comes down to how much you are cooling your chip and how far you are pushing it.

On a side note, I actually suggested to a certain reviewer that he bust out his water-cooling skills for reviewing the SKL-X and TR boards, but to my surprise he was very much against it, to say the least...


----------



## Vya Domus (Sep 23, 2017)

EarthDog said:


> Sort of. Put a shit board on either and they won't reach the same clocks as with a good board. And this is only true for the last gen for AMD, and the past couple for Intel. Prior to that, the board made a HUGE difference for both.
> 
> I'd also be concerned now with X299 boards limiting Intel CPUs... and potential TRipper overclocking as well. So... there's that...
> 
> To say it has "nothing" to do with it is a poor choice of words.



I said that the limit of how far you can push these chips is down to the silicon itself; is that incorrect? You can have the best board in the world with a million phases, and it won't get past what the chip can do.

Of course, on a lower-end board you wouldn't be able to reach that maximum OC, but that *is* ultimately the limit these CPUs have.


----------



## EarthDog (Sep 23, 2017)

Boards matter... 

You will not reach the silicon's limit without a proper board.


----------



## cadaveca (Sep 23, 2017)

GorbazTheDragon said:


> On a side note, I actually suggested to a certain reviewer that he bust out his water-cooling skills for reviewing the SKL-X and TR boards, but to my surprise he was very much against it, to say the least...



You need at least a good 280mm radiator on an AIO for X299. Air cooling doesn't cut it. Both platforms should not be reviewed with anything less than that. Full-on watercooling costs money, good money, and to invest in proper watercooling for a review... I wouldn't do it either.

But I'm the guy that actually watches CPU power consumption. Maybe the only one. 



EarthDog said:


> Boards matter...
> 
> You will not reach the silicon's limit without a proper board.



Why not just come out and say that there are many boards with a 275-285 W power limit? Oh, it's too early.


----------



## Vya Domus (Sep 23, 2017)

EarthDog said:


> Boards matter...
> 
> You will not reach the silicon's limit without a proper board.



And how is that different from what I said ?


----------



## cadaveca (Sep 23, 2017)

Vya Domus said:


> And how is that different from what I said ?


It's not just "with a lower-end board".

It's more like "if you don't buy the most expensive boards, the board is the limit"...


----------



## GorbazTheDragon (Sep 23, 2017)

cadaveca said:


> You need at least a good 280mm radiator on an AIO for X299. Air cooling doesn't cut it. Both platforms should not be reviewed with anything less than that. Full-on watercooling costs money, good money, and to invest in proper watercooling for a review... I wouldn't do it either.



The AIOs won't take the chips past 250 W or so; I'd say you really see the differences between boards when you are going up toward the 350-400 W mark, based on what der8auer seemed to be pulling.

Said reviewer was renowned for his custom water cooling builds and mods, so I hardly think it's a huge investment considering he has a lot of the kit already and would be one of the very few reviewers that would be using a water loop.
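As a rough illustration of why draw climbs so steeply once you push clocks and voltage: dynamic CPU power scales roughly with V² × f. The sketch below is a crude model only, and every wattage, voltage, and frequency figure in it is a hypothetical example, not a measurement from any board discussed here:

```python
# Hedged sketch: dynamic CPU power scales roughly as P ∝ V^2 * f.
# All baseline/overclock figures below are hypothetical examples.
def scaled_power(p_base, v_base, f_base, v_oc, f_oc):
    """Estimate overclocked power from a baseline using P ∝ V^2 * f."""
    return p_base * (v_oc / v_base) ** 2 * (f_oc / f_base)

# e.g. a chip drawing ~200 W at 4.5 GHz / 1.10 V, pushed to 4.9 GHz / 1.30 V
p_oc = scaled_power(200, 1.10, 4.5, 1.30, 4.9)
print(round(p_oc))  # ~304 W by this crude model, ignoring leakage
```

Leakage also rises with voltage, so real draw at such settings would likely be higher still, which is why a cooler that tops out around 250 W becomes the bottleneck quickly.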


----------



## cadaveca (Sep 23, 2017)

GorbazTheDragon said:


> The AIOs won't take the chips past 250 W or so; I'd say you really see the differences between boards when you are going up toward the 350-400 W mark, based on what der8auer seemed to be pulling.



Uh, yeah, that guy. Not listening to anything that comes from his direction. The platform has a base spec of 300 W. Of course you can push that high. The real question is which CPUs allow this? Not really any you can buy right now... my 7900X chips pull just north of 200 W @ 4.5 GHz (or just south, depending on which CPU you ask about). He's quoting numbers that are double... perhaps that is only for benchmark use? I will remind you that this guy is featured in ASUS's RealBench benchmark video... so we can clearly know how he uses his PCs.


BTW, I have no issues breaking 300W on my AIOs.

I do have/had many of the CPUs for this platform, and I do not agree with this particular person's positions on the platform, which some seem to take as gospel.



GorbazTheDragon said:


> Said reviewer was renowned for his custom water cooling builds and mods, so I hardly think it's a huge investment considering he has a lot of the kit already and would be one of the very few reviewers that would be using a water loop.



Meh. As a reviewer, it is most important to create review situations that match your user base. The majority of enthusiasts cannot afford HEDT in the first place, never mind HEDT + high-end water. Even if I had blocks for the platforms (I have the rest of the kit), I still wouldn't use it for reviews.


----------



## GorbazTheDragon (Sep 23, 2017)

cadaveca said:


> Meh. As a reviewer, it is most important to create review situations that match your user base. The majority of enthusiasts cannot afford HEDT in the first place, nevermind HEDT + high-end water. Even if I had blocks for the platforms (I have the rest of the kit), I still wouldn't use it for reviews.


I feel like he's lost a lot of his viewer base by moving away from the cases and modding stuff; that was the main reason I started watching his content. Unfortunate, but I won't make a fuss about it.


----------



## jaggerwild (Sep 23, 2017)

Vya Domus said:


> And how is that different from what I said ?



 Keep quoting what you read; it's not what you know, clearly (only what you have read). You post disinformation on the net, expect to get called out on it!!


----------

