# Skylake i7 6700K loses to Haswell i7 4790K in gaming?



## trodas (Aug 6, 2015)

According to the preliminary (NDA-breaking, since pulled) tests, Skylake is good in synthetic benches (109% vs. the i7 4790K):




...but it sadly and surprisingly loses (by 4 to 6%) in gaming tests to the i7 4790K:






Sure, Anandtech is trying to cover this debacle by using an i7 4770K for an unfair comparison, and ComputerBase tweaked the results to make Skylake look good (1% faster than the i7 4790K) by throwing synthetic benches into the mix, but the gaming tests show that the i7 4790K wins.

Only in Total War: Attila does the Skylake + GTX 980 Ti combination manage a slightly faster result.






At least the TDP got lowered from 95W to 91W (only a marketing change?). OTOH the asking price will be $350 for the i7 6700K, and the i5 6600K will go for $243.

http://wccftech.com/intel-unleashes...5-6600k-processors-91w-z170-platform-support/
http://wccftech.com/intel-skylake-core-i7-6700k-cpu-review-processor-gaming-performance/
http://www.computerbase.de/2015-08/intel-core-i5-6600k-i7-6700k-test-benchmark-skylake/


----------



## Jetster (Aug 6, 2015)

Yep. So really not worth any upgrade unless from AMD or Sandy


----------



## RejZoR (Aug 6, 2015)

It's not worth it in ANY case. Have you seen the price of the 6700K? They want 400 EUR for this crap. I paid way less for an i7 920 back in the day, and that was an enthusiast-grade CPU/platform. Intel thinks their crap GPU is worth such a premium? Heh... and it's not even hexacore. Total disappointment. I was expecting a price between 250-300 €, not freaking 400 €.

I'll just wait for AMD Zen as I originally planned. Or Skylake-E, hopefully without the stupid GPU. Got myself a new case to let the components breathe a bit easier and that will be it.


----------



## happita (Aug 6, 2015)

RejZoR said:


> It's not worth in ANY case. Have you seen the price of 6700k ? They want 400 EUR for this crap. I paid way less for i7 920 back in the day and it was enthusiast grade CPU/platform. Intel thinks their crap GPU is worth such premium? Heh... and it's not even hexacore. Total disappointment. I was expecting a price between 250-300 €, not freaking 400 €.
> 
> I'll just wait for AMD Zen as I originally planned. Or Skylake-E, hopefully without the stupid GPU. Got myself a new case to let components breathe a bit easier and that will be it.



I was thinking they were going to knock Haswell's price down and slot Skylake in at that level. Seems like Intel's got a big head on its shoulders to think I would upgrade to something that performs worse than their last-generation chips. My Sandy will suit me just fine until a worthy upgrade is in order.


----------



## Vayra86 (Aug 6, 2015)

As predicted, as expected, though I would take these benches with a grain of salt as well. A test on Tweakers.net showed an overall IPC increase of about 7-9%, which is business as usual.

Some other details that are lost here:
- overclocking is more accessible again because the voltage regulator is back on the motherboard
- you can tweak base clock again
- RAM can be tweaked further + DDR4

So as an overclocker Skylake really isn't all too bad. Other than that, move along, nothing to see here.
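That 7-9% per-generation IPC figure compounds over time, which is why even "business as usual" adds up for anyone several architectures behind. A quick sketch of the math (the per-generation rate is taken from the Tweakers.net figure above; the generation count is my own assumption, purely for illustration):

```python
# Rough compounding of "business as usual" generational IPC gains.
# The ~7% per-generation uplift comes from the post above; counting
# three architecture steps is an illustrative assumption.

def cumulative_gain(per_gen: float, generations: int) -> float:
    """Compound a fixed per-generation IPC uplift over several generations."""
    return (1.0 + per_gen) ** generations

# Sandy Bridge -> Ivy Bridge -> Haswell -> Skylake: three steps at ~7% each.
print(f"{cumulative_gain(0.07, 3):.2f}x")  # ~1.23x over a Sandy Bridge chip
```

So a Sandy user sees a meaningful jump even though no single generation did.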

About AMD Zen. Really? I think it won't give us all that much over Intel's offerings. The truth of the matter is, we are reaching a point in CPU performance where silicon doesn't get us much further, and the same is happening on the GPU side of things. The big wins come from efficiency in software, stuff like the better threading of DX12, for example. Which is great for us consumers, because we get more for less $$$.

Also given the fact that it's all about commerce, I think the days of big performance leaps are LONG gone. The only reason they existed back in the day is because things were horrendously slow. Not so today, the lower end of the performance curve can already do basic tasks with ease, so there is zero urgency to look for new leaps in performance. It is also economically a very bad idea to do so, much more profit in holding back development to make baby steps forward and have customers pay multiple times for the same stuff.

Reality sucks, right.


----------



## RejZoR (Aug 6, 2015)

What has changed regarding the base clock on Intel CPUs after Nehalem? I could OC using BCLK and an additional multiplier. Were CPUs after Nehalem all about the multiplier only?


----------



## Vayra86 (Aug 6, 2015)

Yes, it was locked to 100 and any big deviation from it would result in major instability.


----------



## CrAsHnBuRnXp (Aug 6, 2015)

I saw yesterday on Newegg an i5 6600K for $249.99 without a HSF, while the i5 4690K is $239.99 WITH a HSF. And currently it can only be bought in a combo, to boot.


----------



## Solaris17 (Aug 6, 2015)

Within 100 points, from some random Japanese site. How exactly is the 4770 unfair compared to the 4790? You know other than clock they are the exact same chip, right? Like EXACTLY the same. I guess I can see how the clock difference can be a problem, I mean that's not a good control. But saying 100 points in Hitman means a CPU is trash is kind of eating the same cake. Not to mention everyone knows the last 3 generations of Intel CPUs have done nothing in terms of game performance when it comes to FPS. Or did I miss the memo? Are we supposed to expect 20fps jumps now like we did with the gen 1 i7, or P4 to Athlon Orleans cores?


Do these results really surprise people? Every test stating ~5% performance improvements was done using CPU-based tests. No one ever claimed to get major performance gains in games. Unless you were using the integrated GPU on the chip, which is apparently ~20% faster than the previous gen.

I'm kind of disappointed in TPU; the sensationalism is real. What _did_ you expect from a $300 CPU that is a failed die shrink?

Are we only as smart as the PowerPoint slides we read now? Even that isn't the case since, like I said, none of this was ever stated or slated or expected from this series of CPUs.


----------



## RejZoR (Aug 6, 2015)

I wish they'd improve the core count already. We've had the 4C/8T configuration for ages... Nehalem was what, 6-7 years ago, and today we have the exact same setup. Still. *sigh* One would expect 8C/16T by now...


----------



## Moofachuka (Aug 6, 2015)

was thinking of upgrading my i7 980 to Skylake for gaming..... guess i'll keep waiting...


----------



## Jetster (Aug 6, 2015)

Moofachuka said:


> was thinking of upgrading my i7 980 to Skylake for gaming..... guess i'll keep waiting...




It would be a worthy upgrade for you.


----------



## MxPhenom 216 (Aug 6, 2015)

Has everyone completely missed the fact that at stock the 4790K boosts to 4.4GHz, whereas the 6700K does 4.2GHz?


----------



## MxPhenom 216 (Aug 6, 2015)

RejZoR said:


> I wish they'd improve core count already. We have 4C 8T configuration for ages... Nehalem was what, 6-7 years ago and today, we have the exact same setup. Still. * sigh * One would expect 8C 16T by now...



But Nehalem was on the enthusiast platform, which had a 6-core chip or two; that's the current-day X79/X99, not mainstream Z87/97/170.


----------



## Moofachuka (Aug 6, 2015)

Jetster said:


> It would be a worthy upgrade for you.


u sure?  how much performance increase would that be? Thx


----------



## MxPhenom 216 (Aug 6, 2015)

Moofachuka said:


> u sure?  how much performance increase would that be? Thx



~25%, plus it's an all-new platform: DDR4, SATA 3 via the chipset instead of a 3rd-party controller, USB 3.0, PCIe 3.0, SATA Express, M.2, etc.


----------



## tabascosauz (Aug 6, 2015)

I'm not sure what the big deal is. AT already stated very clearly that there are some hardware shenanigans going on that we don't understand yet. It's been said a million times in Ian's review that Intel is being very secretive and there's something going on under the hood when Skylake is beating Haswell in synthetics and losing in gaming.

Skylake is slower clock for clock at 3 GHz. Its stock clock is not 3 GHz. It does just fine at 4.2GHz. You forget that the 4790K comes in at 4.4GHz. You're not forced to upgrade to Skylake.
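That clock gap is easy to sanity-check with napkin math: single-thread throughput scales roughly with IPC × clock, so a modest IPC uplift can be cancelled out by a lower turbo clock. A rough sketch, where the 5% IPC figure is an assumed illustrative number, not a measured one:

```python
# Effective single-thread throughput ~ IPC x clock (very rough model).
# The turbo clocks below are the stock figures mentioned in this thread;
# the 5% IPC uplift is an assumption for illustration.

def relative_perf(ipc_gain: float, new_clock_ghz: float, old_clock_ghz: float) -> float:
    """New chip's throughput relative to the old one (1.0 = identical)."""
    return (1.0 + ipc_gain) * new_clock_ghz / old_clock_ghz

# 6700K at 4.2 GHz turbo vs 4790K at 4.4 GHz turbo:
ratio = relative_perf(0.05, 4.2, 4.4)
print(f"{ratio:.3f}")  # ~1.002 -> essentially a wash at stock clocks
```

Which is consistent with the two chips trading blows in game benchmarks.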

@RejZoR 400EUR is a lot to ask for, but a new product is going to be expensive and the price will come down slowly over time. Things are priced differently around the world.

The gaming crowd and the MOAR CORES crowd are real. Intel CPUs are not selling to gamers exclusively; get over it. If the HEDT platform were mainstream and we had 6-8 CORES, Intel would still walk all over AMD, but they could also kiss goodbye to a whole chunk of money spent on extra development. Also, what would happen to the Intel = efficient argument? Mainstream CPUs @ 130W sounds like AMD to me.

It's not like Skylake is supposed to be refined anyway. Broadwell was a half family. Skylake is the first mainstream DDR4 platform. Skylake is the first to have DMI 3.0. Now, with Kaby Lake, Intel has another shot at 14nm, a process that hasn't been particularly kind to them.

Contrary to what has been said, it is a good upgrade for AMD or Sandy users. What's with all the talk about losing x% in gaming? It's a tiny difference, and I thought you were all saying that the Piledriver CPUs gamed "just as well" as Haswell when they were losing by far more than Skylake does. For users of 900-series AMD or 6-series boards, Skylake offers features that can't be found on older boards, features that aren't part of the core itself.

Haswell was terrible at OC and people complained. Intel has addressed that problem with Skylake, which clocks higher and is not restricted by voltage, and people find other things to complain about. The thing we wanna know is what Skylake has under the hood that's been causing all this. If you're just coming to rant about there being no 8 cores, well, go buy AM3+ then. I'm sure you'll be happy with your "8 cores".


----------



## Vayra86 (Aug 6, 2015)

tabascosauz said:


> I'm not sure what the big deal is. AT already stated very clearly that there are some hardware shenanigans going on that we don't understand yet. It's been said a million times in Ian's review that Intel is being very secretive and there's something going on under the hood when Skylake is beating Haswell in synthetics and losing in gaming.
> 
> Skylake is slower clock for clock at 3 GHz. It's stock clock is not 3 GHz. It does just fine at 4.2GHz. You forget that the 4790K comes in at 4.4GHz. You're not forced to upgrade to Skylake.
> 
> ...



This, 'nuff said. We don't need eight or sixteen cores, and we definitely don't need them for gaming. History proves this. Games are made for the lowest common denominator, as I have always said. Gamers are only a few percent of the PC market; people need to get real.


----------



## EarthDog (Aug 6, 2015)

One thing to note.. maybe two...

1. DX12 will allow the use of more cores... whenever games start hitting the market.
2. 4.4GHz on a 4790K is a single core, no? Doesn't Skylake boost all cores to 4.2GHz?



> which clocks higher and is not restricted by voltage


Sorry, I am seeing 4790Ks all over the 4.5GHz+ range, with a rare few at 5GHz. Same with Skylake. If overclocking has improved, it's not by much.

Also, I am not sure what you mean by 'not restricted by voltage'. Haswell starts at what, 1.1V give or take, while Skylake starts at 1.3V give or take. They both end up around 1.45V or so when you start to get close to 5GHz. Also, we've been told by a MFG not to go over 1.42V because of degradation. Now, I don't know if that means it will degrade quickly or what. But that is the same 'limit' commonly held by 'those in the know' not to go over for Haswell. So what restrictions were lifted?


----------



## yogurt_21 (Aug 6, 2015)

EarthDog said:


> One thing to note.. maybe two...



2. Only if your mobo sucks; Intel spec was 4.4 single-core, but I don't know anyone who has a board that doesn't turbo all cores to 4.4.
But you are correct on overclocking: 4.6 is all I can get stable without some extreme intervention, whereas the 6700K is showing 4.8-5.1 in reviews.

But as I posted in the Z170 MSI board review, this change isn't about the CPU. It's about the chipset and the features. If they're not enough for the $ for you, then wait for the next tock. Otherwise, someone comparing a 2600K system to this and ignoring the motherboard is doing themselves a disservice. The CPU may not be faster, but everything else is.


----------



## Vayra86 (Aug 6, 2015)

yogurt_21 said:


> 2. only if your mobo sucks, intel spec was 4.4. single, I don't know anyone who has a board that doesn't turbo all cores to 4.4 though.
> but you are correct with overclocking 4.6 is all I can get stable without some extreme intervention whereas the 6700k is showing 4.8-5.1 in reviews.
> 
> but as I posted in the Z170 MSI board review this change isn't about the cpu. It's about the chipset and features. If they're not enough for the $ for you then wait for the next tock. Otherwise someone comparing a 2600K system to this and ignoring the motherboard is doing themselves a disservice. The cpu may not be faster, but everything else is.



But there is no real reason to upgrade a slow system for its motherboard features. If you are really looking for them, you are probably already an enthusiast, or at least someone with a high-end rig. Everyone else doesn't need DDR4, and the other most important consumer-grade part of the chipset would be SATA, which has already been gen 3 for a couple of years.


----------



## tabascosauz (Aug 6, 2015)

@EarthDog straight from Cutress' review

"Number two: Skylake is more thermally limited than voltage limited. In all the samples I had, as well as a number of results from other reviewers, the main barrier to overclocking was that at some point the thermal limits of the processor kicked in and it reduced both frequency and voltage to recover. With sufficient cooling, such as a water chiller, this means that Skylake could be a really nice overclocker. It also brings up the question of whether Intel is limiting the overclocking potential with sub-standard paste between the die and the heatspreader. We’ve been told (not by Intel) that removing the heatspreader is relatively easy, but from reports it seems that the heatspreader seems to be thinner than before and the paste is only slightly better than what Haswell came with."

Also, we may be going back to pre-SB style overclocking without the instability on the BCLK.


----------



## cadaveca (Aug 6, 2015)

Vayra86 said:


> But there is no real reason to upgrade a slow system for its motherboard features. If you are really looking for them, you are probably already an enthusiast or at least someone with a high end rig. Everyone else doesn't need DDR4 and the other most important consumer-grade part of the chipset would be SATA which is already gen 3 for a couple of years.


This. I mean, I have boards in my possession now. I have a CPU, which won't be on sale for a while.

The boards rock. CPU OC is simple... far easier than Haswell.

So what sells these PCs? Power savings? Or is it Windows 10 compatibility?

Personally, I am an overclocker. So the new platform, to me, offers different overclocking, higher clock speeds, and better DDR4 efficiency. I mean, like X79 bandwidth with just two channels.



tabascosauz said:


> @EarthDog straight from Cutress' review
> 
> "Number two: Skylake is more thermally limited than voltage limited. In all the samples I had, as well as a number of results from other reviewers, the main barrier to overclocking was that at some point the thermal limits of the processor kicked in and it reduced both frequency and voltage to recover. With sufficient cooling, such as a water chiller, this means that Skylake could be a really nice overclocker. It also brings up the question of whether Intel is limiting the overclocking potential with sub-standard paste between the die and the heatspreader. We’ve been told (not by Intel) that removing the heatspreader is relatively easy, but from reports it seems that the heatspreader seems to be thinner than before and the paste is only slightly better than what Haswell came with."
> 
> Also, we may be going back to pre-SB style overclocking without the instability on the BCLK.




Well, my Skylake is the opposite... it is voltage-limited, not thermally limited. I hit the max clock on a Corsair H90 with lots of thermal overhead and no throttling. Also, please do pay attention to whether chips are ES or retail. But I didn't do a CPU review, so not all the info is out yet.


----------



## EarthDog (Aug 6, 2015)

Mine wasn't thermally limited either. I believe I only hit 70-72°C at 4.9GHz, 1.449V load...

I didn't push much further because the CPU isn't mine, and I was told by a different MFG not to go over 1.42V because of degradation.



tabascosauz said:


> Also, we may be going back to pre-SB style overclocking without the instability on the BCLK.


I don't see this happening with unlocked-multi CPUs. The only point is flexibility in clocks (both memory and CPU). Performance gains are few and far between with BCLK increases, from my limited testing so far...


----------



## Delta6326 (Aug 6, 2015)

Review from Tom's Hardware; the 6600K looks great, very close to the 6700K.
I plan on the 6700K, future-proofing for hopefully another 5 years.
http://www.tomshardware.com/reviews/skylake-intel-core-i7-6700k-core-i5-6600k,4252.html


----------



## qubit (Aug 6, 2015)

Vayra86 said:


> Also given the fact that it's all about commerce, I think the days of big performance leaps are LONG gone. The only reason they existed back in the day is because things were horrendously slow. Not so today, the lower end of the performance curve can already do basic tasks with ease, so there is zero urgency to look for new leaps in performance. *It is also economically a very bad idea to do so, much more profit in holding back development to make baby steps forward and have customers pay multiple times for the same stuff.*
> 
> Reality sucks, right.


Exactly. It's the usual crap that you see when there's no competition.


----------



## thebluebumblebee (Aug 6, 2015)

RejZoR said:


> I was expecting a price between 250-300 €, not freaking 400 €.


Why?
| CPU | Introduced | Price ($) |
|---|---|---|
| i7-920 | Nov. 2008 | 284 |
| i7-860 | Sept. 2009 | 284 |
| i7-2600K | Jan. 2011 | 317 (2600 @ $294) |
| i7-3770K | April 2012 | 342 (3770 @ $305) |
| i7-4770K | June 2013 | 350 (4770 @ $312; same for 4790/4790K) |
| i7-5775C | June 2015 | 377 |
| i7-6700K | Aug. 2015 | same as the 4770K/4790K? (but without a cooler) |

Info from cpu-world


----------



## RejZoR (Aug 6, 2015)

Mostly because it's not all that groundbreaking. It's basically Nehalem with a few more instructions, slightly better instructions per clock, and a stupid GPU attached to it, made at 14nm. Everything else is pretty much identical. Hell, my old clunker can run PCIe x16 in any slot with any number of cards. This one instantly gets chopped to x8/x8. It's just sad, and they charge way too much for what it is. And I think it's mostly just because of the GPU. Without it, it would cost as much as I said. Most likely.


----------



## tabascosauz (Aug 6, 2015)

RejZoR said:


> Mostly because it's not all that groundbreaking. It's basically Nehalem with a few more instructions, slightly better instructions per clock, and a stupid GPU attached to it, made at 14nm. Everything else is pretty much identical. Hell, my old clunker can run PCIe x16 in any slot with any number of cards. This one instantly gets chopped to x8/x8. It's just sad, and they charge way too much for what it is. And I think it's mostly just because of the GPU. Without it, it would cost as much as I said. Most likely.



OEMs, man. OEMs love to sell PCs with "powerful" quad-core processors because most people think that the CPU will let you play Crysis at 4K. Without an iGPU, they can't sell PCs like that.


----------



## RejZoR (Aug 6, 2015)

OEMs love(d) to sell a GeForce 2 with 8GB of VRAM, because people think a gazillion gigabytes of VRAM makes all the difference...


----------



## EarthDog (Aug 6, 2015)

They still do...


----------



## tabascosauz (Aug 6, 2015)

We can't really blame them, can we? Obviously Intel did a lot of shady things to get all those OEMs, but aside from enterprise, they're still the ones keeping Intel afloat, right? Look what happened to AMD, unable to get companies to put their CPUs in their PCs.

Maybe Intel will decide to have a line of SKUs or just one SKU without iGPU enabled, like the 3350P or 1230V3. Hopefully. Well, probably not.


----------



## EarthDog (Aug 6, 2015)

I may be jumping into the conversation late, but... the X79/X99 platform doesn't have iGPUs... close enough?


----------



## tabascosauz (Aug 6, 2015)

EarthDog said:


> I may be jumping into the conversation late, but... the X79/X99 platform doesn't have iGPUs... close enough?



Yeah, but these are mainstream CPUs. The i5-3350P was great, right up until Haswell, which had no equivalent. The E3 was great up until Broadwell, when Intel threw it all away. But maybe Broadwell's E3s will prove to be just an aberration, like its i5 and i7, and Skylake will be good again.


----------



## lilhasselhoffer (Aug 6, 2015)

1) None of the enthusiast-level products from Intel focus on the iGPU.
2) We've had quad-core processors for nearly a decade. Core 2 Quads released in July 2007.
3) Most games and programs still don't give a crap about core count on processors; they run off of frequency.


Intel has had a consistent 5-7% performance increase since SB.  SB was unique not because of the development team, so much as manufacturing.  Thermally limited processors were a serious concern, so SB had a soldered lid.  SB ran insanely cool, so it became a boon for overclocking (the changes to overclocking processes were also instrumental in SB).  SB came out with enough of everything for most people (SATA, USB, etc...).  SB functionally killed Core2 and Nehalem by simply being an amazing processor.

Unfortunately, the success of SB is a double-edged sword. Performance leaps were paid for with rapidly advancing manufacturing techniques. That's great, until a shrinking process yields greater costs that can't be offset by yield. Likewise, SB heralded the integrated GPU. It was originally a way to compete with the APU, but it's become an obsession for Intel. They segment off more and more of their resources into developing the iGPU, despite not being able to use modern GPU designs due to licensing. I'm of the opinion that SB's stellar success has put us into a position where the only way we'll ever see such huge generational gains again would be for a generation to resoundingly fail. I'm pretty sure Intel saw this with Broadwell, and it's why the Haswell refresh occurred.

There's too much saturation of the market with product that doesn't have a marked difference. Say what you want, but most people couldn't care less about 1-3 FPS. They only care that the game is playable, and aren't willing to spend thousands of dollars to get those extra few frames. Intel knows this, and they've made sure that their development keeps just far enough ahead of the unacceptable line to allow each generation to sell. Personally, I hope Skylake tanks and it's a turd. That would spur another substantial leap, and get us out of the huge hole that SB left behind.


----------



## awesomesauce (Aug 6, 2015)

Don't worry, they're sure gonna release a 6770K/6790K.


----------



## tabascosauz (Aug 6, 2015)

lilhasselhoffer said:


> Personally, I hope Skylake tanks and it's a turd.  That would spur another substantial leap, and get us out of the huge hole that SB left behind.



But will Intel really be able to spawn an amazing successor? Is it really because Intel is employing its just-enough business strategy, or is Intel running into genuine difficulties? Things at Intel were extremely relaxed for Ivy and Haswell, but this time around it seems different from the usual comforting tick-tock; it feels as if Intel is wrestling with 14nm and just can't get around the problems it's having. After all, isn't Kaby Lake just an indicator of this? Expecting an easy shrink and easy money from 14nm, Intel failed with Broadwell and its bizarre lineup (and we laughed at AMD for making Carrizo mobile-only; Broadwell was no better), then seems to have failed again with Skylake, though perhaps not in such spectacular fashion. Now Kaby Lake seems to be their next attempt at mastering this node.


----------



## cadaveca (Aug 6, 2015)

tabascosauz said:


> But will Intel really be able to spawn an amazing successor? Is it really because Intel is employing its just-enough business strategy, or is Intel running into genuine difficulties? Things at Intel were extremely relaxed for Ivy and Haswell, but his time around it seems different from the usual comforting tick-tock; it feels as if Intel is wrestling with 14nm and just can't get around the problems it's having. After all, isn't Kaby Lake just an indicator of this? Expecting an easy shrink and easy money from 14nm, Intel failed with Broadwell and its bizarre lineup (and we laughed at AMD for making Carrizo mobile-only, Broadwell was no better), then seems to have failed again with Skylake, though perhaps not in such a spectacular fashion. Now Kaby Lake seems to be their next attempt at mastering this node.


The one factor left that many do not consider is software.

We need compelling software that uses more than 4 cores before Intel will release such devices into the consumer market.

Hopefully Win10 is the start of that transition, but in the end, the way these companies design these devices is based on meeting consumer needs, and they make billions doing so.


----------



## lilhasselhoffer (Aug 6, 2015)

tabascosauz said:


> But will Intel really be able to spawn an amazing successor? Is it really because Intel is employing its just-enough business strategy, or is Intel running into genuine difficulties? Things at Intel were extremely relaxed for Ivy and Haswell, but his time around it seems different from the usual comforting tick-tock; it feels as if Intel is wrestling with 14nm and just can't get around the problems it's having. After all, isn't Kaby Lake just an indicator of this? Expecting an easy shrink and easy money from 14nm, Intel failed with Broadwell and its bizarre lineup (and we laughed at AMD for making Carrizo mobile-only, Broadwell was no better), then seems to have failed again with Skylake, though perhaps not in such a spectacular fashion. Now Kaby Lake seems to be their next attempt at mastering this node.



Do you assume incompetence on the part of Intel, or do you assume that Intel is gating its output to meet consumer demand? Intel isn't incompetent; despite their propensity for backroom deals, they are still the uncontested best manufacturer of PC CPUs. That only leaves managed demand. Let's tear into that:
1) Does the main consumer of CPUs require more components integrated onto the CPU? Yep, cue the iGPU.
2) Does a minor performance increase per generation still allow sales? Yep, the early adopters buy it, and the users of old systems eventually get to the point where a replacement is required anyway.
3) How does Intel structure the pricing on their items? Three markets (consumer, enthusiast, server) all have their needs directly catered to, and there's little overlap. Where overlap occurs, pricing between offerings is comparable.



I can't really see why Intel wouldn't manage their growth to the market. That $20 million investment that yielded a 7% performance increase could carry them over two generations of CPUs, factoring in die-shrink efficiency. Why would you give consumers a 9% generational increase if there's no competition and you can cut R&D costs? Heck, even if a dark horse arises, you could include some banked performance improvements and offer a generation of CPUs that would blow your competitors out of the water. This is the definition of good business practice, even if we as consumers hate it. That hate is hard to weaponize, though, but a generation of CPUs that sucked could galvanize that response. The reason I hope Skylake is a turd isn't to be contrarian; it's that I hope it brings an otherwise unaware populace a fuller understanding of what Intel is getting away with.




cadaveca said:


> The one factor left that many do not consider is software.
> 
> We need compeling software that uses more than 4 cores before Intel will release such devices into the consumer market.
> 
> Hopefully Win10 is the start into that transition, but in the end, the way these company design these devices is based on meeting consumer needs, and they make billions doing so.



I believe I said something to that effect; in 8 years the benefits of quad core processors (though to be fair, threading on Core processors was pretty terrible) still aren't being realized.


----------



## tabascosauz (Aug 6, 2015)

Best to let sleeping dogs lie, I guess...? The reason most people don't realize that Intel's business practices are exemplary is that they equate making money with evil. I don't think they need to know the details of how good an insurance policy Intel's profits provide, or how Intel has undoubtedly accumulated that wealth by restricting the progression of CPU technology.

What I meant was that perhaps Intel is feeling the pinch. They obviously have a lot of safety nets, if you will.

What @cadaveca said makes a lot more sense than wanting 8 or 16 cores. Be it on Piledriver or Haswell-E, having 8 cores of any kind doesn't make sense if games and software don't give a singular crap about using them. It's good that Windows 10 is showing a bit of promise, however small, that CPU usage can be improved even without redoing a game for DX12.


----------



## Vayra86 (Aug 7, 2015)

qubit said:


> Exactly. It's the usual crap that you see when there's no competition.



This also happens in other, very competitive markets. And I think it is a misjudgment to think there is no competition for Intel. They have their problems ahead of them; AMD is having its problems today.


----------



## Solaris17 (Aug 7, 2015)

thebluebumblebee said:


> Why?
> CPU/introduced/price ($)
> i7-920/Nov.2008/284
> i7-860/Sept.2009/284
> ...



lol, that kinda seems off. I pre-ordered my 920 for $1k


----------



## EarthDog (Aug 7, 2015)

Solaris17 said:


> lol, that kinda seems off. I pre-ordered my 920 for $1k


Lol..you got taken to the cleaners dude. Wow.


----------



## haswrong (Aug 7, 2015)

tabascosauz said:


> Yeah, but these are mainstream CPUs. The i5-3350P was great up until Haswell had no equivalent. E3 was great up until Broadwell when Intel threw it all away. But maybe Broadwell's E3s will prove to be just an aberration like its i5 and i7 and Skylake will be good again.


The 3350P should have VT-d enabled, but HWiNFO64 says the feature is supported but disabled. Can't make it enable...

Edit: aargh, I've found it's not supported by the motherboard... looks like I need to ditch the MSI board in favor of a much, much better one. Damn MSI.


----------



## Solaris17 (Aug 7, 2015)

EarthDog said:


> Lol..you got taken to the cleaners dude. Wow.



WOOP! You're right. I just logged into my mega-old Provantage account and it was $320! I only remembered the grand total of the order: $1017.23.


----------



## Frag_Maniac (Aug 7, 2015)

Solaris17 said:


> How exactly is the 4770 unfair compared to the 4790? you know other than clock they are the exact same chip right?



Not entirely true. The 4770 had notorious heat problems, and they found out it was due to poor thermal transfer between the chip and the heat spreader. Fixing that (mostly with better TIM application) was essentially what allowed it to be clocked much higher. Hence the 4790 was born.


----------



## Frag_Maniac (Aug 7, 2015)

RejZoR said:


> It's not worth in ANY case. Have you seen the price of 6700k ? They want 400 EUR for this crap.


Let's not let one pre-market test and Euro price estimates skew the facts here.

This puts it in more perspective.

_"The flagship SKU, Intel Core i7 6700K will retail for $313 which is actually pretty cheap, considering the previous flagship MSRPs of $339 USD and $377 USD for the Core i7-4790K and Core i7 5775C respectively."_

Source: http://wccftech.com/intel-skylake-s-lineup-list-price-i7-6700k-313-i5-6600k-225/


----------



## RejZoR (Aug 7, 2015)

World doesn't revolve around USA you know...


----------



## Vlada011 (Aug 7, 2015)

Tragedy! That's the reason I escaped to an independent platform.
By the time the Skylake-E successor shows up, my platform will be 3-4 years old.
People with X58 didn't miss anything; people with the X58 revision with USB 3.0/SATA III could freely cross over to X99, and that's a real upgrade, with improvements on all fronts, not an upgrade just because the chipset evolved from PCI-E 2.0 to PCI-E 3.0, when even a graphics card at x8 Gen 2 gives the same performance as x16 Gen 3...
From X99 you can watch the madness from a distance: a new chipset every 2 years, a new socket every 4+ years.
And many people will buy a Maximus VIII and an i7-6700K that will cost who knows how much, when they could build X99 right now for $650 in the USA if they just look for the better discounts: a 6-core processor, fluxless solder, quad channel, a motherboard with all the features, USB 3.1, 32 Gb/s M.2 transfers, 2-3 PCI-E x16 slots, and 16GB of quad-channel 2800 MHz memory...
Of course, if someone buys a processor now they will buy Skylake, not Haswell, but the fact that the same performance has been on the market for more than a year is a tragedy.
Intel knows that people will upgrade just for DDR4 and close one eye to a 2-3% weaker platform. It's obviously the same performance as Haswell; only the clock decides.
The same will happen with Skylake-E: the processor will be stronger if Intel launches something with a 3.8 GHz stock clock. OK, people will have less headroom for OC, but that's not important to them. We should concentrate on real improvements... every GeForce successor is almost an 80% improvement; a TITAN X works like TITAN Black SLI...


----------



## Frag_Maniac (Aug 7, 2015)

RejZoR said:


> World doesn't revolve around USA you know...



LOL, I should be the one saying Europe isn't the best reference point for Intel pricing, but then I already did, didn't I?

The UK has high prices on some things, and Aussies get hit even worse. The difference is, I don't see them using it as an excuse to say the product itself is "crap".


----------



## AsRock (Aug 7, 2015)

Jetster said:


> Yep. So really not worth any upgrade unless from AMD or Sandy



Still not worth it even on SB, as that's 100% able to run your games at good frame rates to begin with.


----------



## Aquinus (Aug 7, 2015)

tabascosauz said:


> Also, we may be going back to pre-SB style overclocking without the instability on the BCLK.


Not as long as the PCI-E clock is tied directly to the BCLK. That's really one of the only reasons why you can't overclock it very well: the PCI-E bus gets super flaky after exceeding a 4-5 MHz overclock from the 100 MHz base.

Also, my 3820 is still perfectly adequate. I suspect it will be for some time to come.
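The tied-clock behaviour Aquinus describes can be sketched in a few lines. This is illustrative only; the 40x multiplier and the 1:1 PCI-E ratio are assumptions for the sketch, not Intel documentation:

```python
# Illustrative sketch: when the PCI-E clock is tied to BCLK, every
# derived clock scales together. The 40x multiplier is an example value.

def derived_clocks(bclk_mhz, core_mult=40):
    """Clocks (in MHz) that scale with BCLK on a tied platform."""
    return {
        "core": bclk_mhz * core_mult,  # CPU core = BCLK x multiplier
        "pcie": bclk_mhz,              # PCI-E runs 1:1 with BCLK when tied
    }

stock = derived_clocks(100)    # {'core': 4000, 'pcie': 100}
bumped = derived_clocks(105)   # a 5 MHz BCLK bump

# The core gains 200 MHz, but PCI-E is now 5% out of spec, roughly
# the point where the bus reportedly starts getting flaky.
print(stock, bumped)
```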


----------



## RejZoR (Aug 7, 2015)

I don't get why it can't be entirely independent?! If AGP/PCI could be entirely separated from the FSB clock back in the day, why can't PCIe be?


----------



## EarthDog (Aug 7, 2015)

Aquinus said:


> Not as long as the PCI-E clock is tied directly to the BCLK. That's really one of the only reasons why you can't overclock it very well: the PCI-E bus gets super flaky after exceeding a 4-5 MHz overclock from the 100 MHz base.
> 
> Also, my 3820 is still perfectly adequate. I suspect it will be for some time to come.


The PCIe and DMI are separated in Skylake. I have screenshots of 225 BCLK in my review. No straps, no nothing. Just pure, unadulterated, 1 MHz at a time, BCLK... like the old days.
http://www.overclockers.com/intel-skylake-i7-6700k-cpu-review/



RejZoR said:


> World doesn't revolve around USA you know...


Nor does it revolve around your part of the world. We are talking pricing here, not world peace...


----------



## blabla21 (Aug 7, 2015)

I presume upgrading from my i5 2500K @ 4.5 GHz is a no right ?!


----------



## EarthDog (Aug 7, 2015)

It's a yes to me, and to Anand... it's a 25% gain between the two clock for clock... that said, it's up to you, as the 2500K is plenty serviceable, particularly when overclocked.


----------



## AsRock (Aug 7, 2015)

blabla21 said:


> I presume upgrading from my i5 2500K @ 4.5 GHz is a no right ?!



I would not bother; a little while ago I did a few tests with my 290X and the 2500K and it ran games perfectly fine.

Save your money



thebluebumblebee said:


> Why?
> CPU/introduced/price ($)
> i7-920/Nov.2008/284
> i7-860/Sept.2009/284
> ...



Damn, bumping those prices up; I thought they were creeping up.


----------



## blabla21 (Aug 7, 2015)

Of course in CPU-intensive programs I would benefit from Skylake, but since most games I play are GPU-bound, I'm wondering if that 10% is even worth it.


----------



## AsRock (Aug 7, 2015)

Nope, even more so given what it's going to cost.


----------



## EarthDog (Aug 7, 2015)

blabla21 said:


> Of course in CPU-intensive programs I would benefit from Skylake, but since most games I play are GPU-bound, I'm wondering if that 10% is even worth it.


Have you read any reviews??? Read them. Educate yourself.


----------



## blabla21 (Aug 7, 2015)

EarthDog said:


> Have you read any reviews??? Read them. Educate yourself.



I did, but most reviews have the i5 2500K at stock, and even then it's only 20%; OC'd to 4.5 GHz I presume it would be a 10-15% max increase in games (which is like 10 FPS?), so I think I will stay with my SB until Cannonlake.


----------



## EarthDog (Aug 7, 2015)

Can't you overclock Skylake??? Yes you can... so that gap would still remain the same.


----------



## blabla21 (Aug 7, 2015)

EarthDog said:


> Can't you overclock Skylake??? Yes you can... so that gap would still remain the same.



Yes, and that is good for Skylake, but the problem is that for someone like me who only games and does barely any video rendering or Photoshop, the price of a new CPU + new mobo + DDR4 RAM (which, mind you, is 20+% more expensive in my country) would not be worth it. That's why I said I'm staying with this chip; it's really good.


----------



## EarthDog (Aug 7, 2015)

Indeed. Don't forget though, you can sell CPU/Mobo/Ram which will offset some of that cost.


----------



## Jetster (Aug 7, 2015)

SB is 5 years old. Upgrade if you can afford it.


----------



## Aquinus (Aug 7, 2015)

RejZoR said:


> I don't get it why it can't be entirely independent?! If AGP/PCI could be entirely separated from the FSB clock back in the days, why can't be PCIe?


You'd have answered your own question if you knew more about how computers work. Back in the FSB days, the memory controller, PCI(-E), and AGP buses were off the chipset, not the CPU. Therefore it was entirely possible for the motherboard to have its own clockgen for the buses. It's the way it is now because it takes less hardware; you're reusing a clock signal that's already there. Why add more hardware to the motherboard when we're trying to minimize it? Also, the gain would only be for enthusiasts, which we all know isn't where the money in the market is.


Jetster said:


> SB is 5 years old. Upgrade if you can afford it.


It's also still perfectly adequate in most cases. I wouldn't upgrade unless the OP is actually hitting a bottleneck on it. Remember that resolution-scaling review a while back with different CPUs? At higher resolutions the kind of CPU you have starts to factor out of the equation because the bottleneck is the GPU. Just food for thought.


----------



## RejZoR (Aug 7, 2015)

And who says the clocks have to be synchronous across the CPU? They could be async...


----------



## Aquinus (Aug 7, 2015)

RejZoR said:


> And who says the clocks have to be synchronous across the CPU? They could be async...


You would need an independent clock gen which takes die space or motherboard space. Do you fancy yourself an electrical engineer?

Maybe you should go give Intel some advice on how to make CPUs...


----------



## EarthDog (Aug 7, 2015)

Aquinas...

http://www.techpowerup.com/forums/t...i7-4790k-in-gaming.215004/page-3#post-3327156


> The PCIe and DMI are separated in Skylake. I have screenshots of 225 BCLK in my review. No straps, no nothing. Just pure, unadulterated, 1Mhz at a time, BCLK... like the old days.
> http://www.overclockers.com/intel-skylake-i7-6700k-cpu-review/



Assuming you are talking about Skylake of course. If not, continue to ignore me.


----------



## Aquinus (Aug 7, 2015)

EarthDog said:


> Aquinas...
> 
> http://www.techpowerup.com/forums/t...i7-4790k-in-gaming.215004/page-3#post-3327156
> 
> ...


I was not talking about Skylake. I was talking about every CPU since the PCI-E root complex was moved to the CPU. The first step in this direction was supporting BCLK straps, which were really just different core:PCIe and core:DMI ratios. Clearly it was worth it, but in the past it didn't seem to be. Good to know that they've been decoupled, though.

Edit: He said they've been decoupled, but I don't see anything about how far he got with overclocking it. All the screenshots seem to be with the BCLK at 100 MHz.


----------



## EarthDog (Aug 7, 2015)

Sorry, my fault.. Skylake thread... I thought the context was Skylake too.. Oops!

As I said, mine was up to 225 MHz in my review of the MSI M7 board. I hit 250 BCLK but... it wasn't very stable. The link was to the CPU review that states it...

Here is my review. Screenshots towards the bottom in the Overclocking section.
http://www.overclockers.com/msi-z170a-gaming-m7-motherboard-review/


----------



## haswrong (Aug 7, 2015)

EarthDog said:


> Sorry, my fault.. Skylake thread... I thought the context was Skylake too.. Oops!
> 
> As I said, mine was up to 225 MHz in my review of the MSI M7 board. I hit 250 BCLK but... it wasn't very stable. The link was to the CPU review that states it...
> 
> ...


Nice one. Ummm, can you run a graphics card at full x16 PCIe speed plus an x4 SSD at once? Or can it only do x8/x8/x4?


----------



## EarthDog (Aug 7, 2015)

It depends on the board, where the PCIe lanes are coming from, and if it has a PLX chip. All of those are variables to answer that question.


----------



## bug (Aug 7, 2015)

blabla21 said:


> I presume upgrading from my i5 2500K @ 4.5 GHz is a no right ?!



I'm going to upgrade this time. The 2500K and 6600K seem about equal. But if you account for the fact that the iGPU has been seriously beefed up within the same TDP since Sandy Bridge, logic dictates the CPU cores use much less power. And I'm not using the iGPU. Plus, there are the additional goodies like PCIe 3.0 all over the place and M.2 slots (no more cables for SSDs).
Of course, the above are things I care about; it's not a given they're important to you, too.


----------



## bug (Aug 7, 2015)

haswrong said:


> nice one. ummm, can you run graphics card at full 16x pcie speed + a 4x speed ssd at once? or is it able to perform only 8x 8x 4x?



You can. There are 16 PCIe 3.0 lanes linked directly to the CPU, but there are additional lanes connected to the southbridge and you can run a ton of peripherals off those lanes. See: http://images.anandtech.com/doci/9483/Z170 Platform.jpg

Edit: Note that the southbridge is linked to the CPU through DMI 3.0 (essentially a PCIe 3.0 x4 bus), thus potentially bottlenecking reads from the added peripherals. It can be an issue if your video card tries to read large textures from a striped RAID array, but otherwise you don't transfer that much data over DMI. And even with that bottleneck in place, there aren't any systems out there that are faster in this scenario.
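The "essentially a PCIe 3.0 x4 bus" parenthetical can be sanity-checked with quick arithmetic; the figures below are the standard PCIe 3.0 parameters, and the result ignores packet/protocol overhead:

```python
# Quick check of "DMI 3.0 is essentially a PCIe 3.0 x4 bus".
# PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding.

GT_PER_S = 8.0          # PCIe 3.0 per-lane transfer rate
ENCODING = 128 / 130    # 128b/130b line-code efficiency
LANES = 4               # DMI 3.0 width

# Usable bandwidth in GB/s (8 bits per byte), before protocol overhead.
dmi3_gb_s = GT_PER_S * ENCODING * LANES / 8
print(f"DMI 3.0 ~ {dmi3_gb_s:.2f} GB/s")  # shared by everything behind the PCH
```

That works out to roughly 3.94 GB/s, which is why a fast NVMe drive plus other PCH peripherals can plausibly saturate the link.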


----------



## Jetster (Aug 7, 2015)

First, they're not equal:
http://www.cpu-monkey.com/en/compare_cpu-intel_core_i5_2500k-5-vs-intel_core_i5_6600k-521

And second, that's not how PCI-E lanes work. It's a max of 16, so no on the x8/x8/x4.


----------



## Vayra86 (Aug 7, 2015)

Jetster said:


> First, they're not equal:
> http://www.cpu-monkey.com/en/compare_cpu-intel_core_i5_2500k-5-vs-intel_core_i5_6600k-521
> 
> And second, that's not how PCI-E lanes work. It's a max of 16, so no on the x8/x8/x4.



Correct, but Skylake also has separate lanes through the DMI interface now, which is pretty interesting for storage alongside a beefy GPU. Theoretically that interface could even double as a sort-of PLX solution.


----------



## haswrong (Aug 7, 2015)

EarthDog said:


> It depends on the board, where the PCIe lanes are coming from, and if it has a PLX chip. All of those are variables to answer that question.


I'm sorry guys, I let myself get carried away by false information. I previously read that Skylake was supposed to have *20 PCI-E lanes* available! According to Intel's ARK database, that's *not the case at all*!

With that in mind, I'm taking back my alarming question.

Only to raise another one: is Skylake just a Haswell re-refresh?


----------



## EarthDog (Aug 7, 2015)

haswrong said:


> only to raise another one: is skylake just a haswell re-refresh?


no.


----------



## RejZoR (Aug 7, 2015)

I fell in love with AsRock Z170 Extreme 7+. What a pretty board.


----------



## haswrong (Aug 8, 2015)

EarthDog said:


> no.


Well, that's good, because I was about to think Intel was trying to pull our leg..



RejZoR said:


> I fell in love with AsRock Z170 Extreme 7+. What a pretty board.


Absolutely! (But the Biostar Z170X ain't bad either.)


----------



## TheHunter (Aug 8, 2015)

Frag Maniac said:


> LOL, I should be the one saying Europe isn't the best reference point for Intel pricing, but then I already did didn't I?
> 
> UK have high prices on some things, Aussies get hit even worse. The difference is, I don't see them using it as an excuse to say the product itself is "crap".


Fun fact,

I bought my 4770K at release back in 2013 and I paid ~295€ for it (actually only ~250€, got it with no VAT); now, 2 years later, at the same store a 4770K costs a frickin' 355€, and the i7 6700K is 380€ there.


----------



## Frag_Maniac (Aug 8, 2015)

TheHunter said:


> Fun fact,
> 
> I bought my 4770K at release back in 2013 and I paid ~295€ for it (actually only ~250€, got it with no VAT); now, 2 years later, at the same store a 4770K costs a frickin' 355€, and the i7 6700K is 380€ there.




The 4770K was a lemon from the get-go due to poor heat transfer between chip and heat spreader, like I said. Word soon got out, they made a running change with TIM, and it was re-released as the 4790K. At that point no one in their right mind would even have considered buying a 4770K.

You really can't use an old product as a comparison, because quantities can affect pricing. It can be the same with new products in some places, depending on availability of not only the CPUs but also matching MBs.

The real point, though, is that pricing is not always a legit way of defining product value, especially if you live in an area where pricing is crazy like that.


----------



## TheHunter (Aug 9, 2015)

These prices are from one German store, one of the cheaper ones.

Well, yeah, lemon; some are, some aren't.. Mine acts the same as a 4790K at the same frequency, heat-wise, although I got a good low-volts chip.

PS: my point was that Intel seems to be raising prices instead of lowering them, at least in the EU.
290€ vs 355€ 2 years later is a joke.

Same thing with the 5820K: at its release ~355€, now a minimum of 395€..


----------



## Vayra86 (Aug 10, 2015)

TheHunter said:


> These prices are from one German store - one of cheaper stores.
> 
> Well yea lemon, some are some not.. Mine acts the same as 4790K at same freq. - heat wise., Although I got a good low volts chip.
> 
> ...



This is probably (also) related to the drop in value of the euro versus the dollar.


----------



## Makaveli (Aug 14, 2015)

Solaris17 said:


> lol that kinda seems off. I pre-orderd my 920 for $1k



lol WTF

was it made with diamonds and gold pins?

Even the W3520, which is the Xeon version of the 920, was not that expensive at launch!!


----------



## Zoinho (Oct 15, 2015)

4790K stock beats 6700K OC!!! Compare (youtu.be/zKwKOf0WZfw?t=27s) vs (youtu.be/e6axdpjyZzA?t=27s)...


----------



## Vlada011 (Oct 15, 2015)

Intel's worst processor of the last several years, I'd say from the i7-870 until today, is the i7-4770K.
Best overclocking CPU is the i7-2600K.
Best overclocking CPU + new features is i7-3770K.
The CPU with the fastest and nicest specifications and Turbo is the i7-4790K.
The best CPU + platform is i7-6700K + Z170.
The best performance for the money, maybe ever, and stronger than all of them, is the i7-5820K.


As you see, all of them are nice depending on how you look at it, except the i7-4770K.
Owners of that model have no reason to be angry and could easily sell the CPU; it's excellent if someone doesn't want to OC and just wants the specifications and an i7 for a slightly cheaper price.
The i7-2600K is a good CPU, but the i7-3770K is only a little weaker for OC and hotter, and together with native Intel controllers and faster memory it's again the better buy.
I can picture the jealous people who give other customers bad advice and explain that the i7-4790K is better for gaming, faster and a better buy... That's nonsense, and many people made a crucial mistake 4-5 months ago out of fear of DDR4 prices; now they can find 32GB of DDR4 for the same price as 16GB cost in March/April. Same speed, same latency, same kit, excellent class, only bigger... But they missed the chance to upgrade to the best platform since DDR3 launched. The i7-4790K is maybe better, but only out of the box. At the same frequency they are equal, or the i7-6700K is better, especially in multithreaded applications, closest to the i7-5820K. 90% of K-processor customers will keep both of them at 4.5-4.6 or 4.8 GHz, and then the fact that the i7-4790K is faster for gaming at factory specifications doesn't matter any more.
People who bought an i7-4790K after Intel presented this nice Z170 platform made a serious mistake.
I would sell the Z97 platform to someone who has AMD, for a cheaper price, and enjoy DDR4 and M.2 devices, even ones that need x8 Gen 3 speeds to offer their best performance. A much better platform.


----------



## peche (Oct 15, 2015)



Vlada011 said:


> Best overclocking CPU + new features is i7-3770K


That explains a lot about why those chips are still at high prices....

Regards,


----------



## EarthDog (Oct 15, 2015)

What, one person's opinion on a forum? Good call peche...


----------



## Zoinho (Oct 15, 2015)

EarthDog said:


> What, one person's opinion on a forum? Good call peche...


This is not an opinion, this is a fact. For gaming, the 6700K loses to the 4790K.


----------



## tabascosauz (Oct 15, 2015)

Zoinho said:


> This is not an opinion, this is a fact. For gaming, the 6700K loses to the 4790K.



So what do you propose? People looking to settle on a LGA115x i7 should buy the 4790K?

What does this change? What can we do other than wait for Kaby Lake and Zen? You want people to go backwards and say "no thanks, I don't want DDR4; I want to have DDR3 for the minute gains I get from the 4790K in gaming"? Get real.

I have the 4790K and 16GB of DDR3; I wouldn't ditch it for a 6700K anytime soon even if the 6700K did offer a small advantage over the 4790K in gaming. For someone looking to get a new high-end CPU and motherboard without going X99, it's still the smarter choice to hop on Skylake as I don't see any kind of price savings with buying LGA1150.


----------



## Zoinho (Oct 15, 2015)

tabascosauz said:


> So what do you propose? People looking to settle on a LGA115x i7 should buy the 4790K?
> 
> What does this change? What can we do other than wait for Kaby Lake and Zen? You want people to go backwards and say "no thanks, I don't want DDR4; I want to have DDR3 for the minute gains I get from the 4790K in gaming"? Get real.
> 
> I have the 4790K and 16GB of DDR3; I wouldn't ditch it for a 6700K anytime soon even if the 6700K did offer a small advantage over the 4790K in gaming. For someone looking to get a new high-end CPU and motherboard without going X99, it's still the smarter choice to hop on Skylake as I don't see any kind of price savings with buying LGA1150.



Man, DDR4 at up to 3.0 GHz vs DDR3 at 1.6 GHz makes no difference for gaming. Compare the 6700K with DDR4 at 3 GHz vs the 4790K with DDR3 at 1.6 GHz: youtu.be/zKwKOf0WZfw?t=27s and look at this: youtu.be/7ECkgwnaaAg?t=1m20s
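For what it's worth, the raw-bandwidth gap being waved away here is real on paper, even if it rarely shows up in FPS; a quick sketch of the theoretical per-channel peaks (ignoring latency, which is what actually matters for games):

```python
# Rough sketch of the peak-bandwidth gap under discussion. These are
# theoretical per-channel maxima (64-bit bus = 8 bytes per transfer);
# they say nothing about latency or gaming FPS, which is the point
# being argued in the thread.

DDR3_1600_MT_S = 1600
DDR4_3000_MT_S = 3000

ddr3_mb_s = DDR3_1600_MT_S * 8   # 12800 MB/s (PC3-12800)
ddr4_mb_s = DDR4_3000_MT_S * 8   # 24000 MB/s

ratio = ddr4_mb_s / ddr3_mb_s
print(ddr3_mb_s, ddr4_mb_s, f"{ratio:.2f}x")  # 12800 24000 1.88x
```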


----------



## tabascosauz (Oct 15, 2015)

Zoinho said:


> Man, DDR4 at up to 3.0 GHz vs DDR3 at 1.6 GHz makes no difference for gaming. Compare the 6700K with DDR4 at 3 GHz vs the 4790K with DDR3 at 1.6 GHz: youtu.be/zKwKOf0WZfw?t=27s



Oh look, a troll. I see how it is.






EDIT: Don't go back on your word please. I want to see how you got a magical 4.8GHz out of that original, *unedited* post. I want to see how exactly you intended to explain this one. 4.8GHz? Not how I see it.


----------



## FordGT90Concept (Oct 15, 2015)

That's more of a GPU test than a CPU test.  CPU differences are best shown at low resolution and low settings.


----------



## phanbuey (Oct 15, 2015)

FordGT90Concept said:


> That's more of a GPU test than a CPU test.  CPU differences are best shown at low resolution and low settings.


Definitely true - I think it just shows the immaturity of the platform / drivers if anything.


----------



## R-T-B (Oct 15, 2015)

Zoinho said:


> This is not an opinion, this is a fact. For gaming, the 6700K loses to the 4790K.


Not with a simple BIOS adjustment that almost all new motherboard BIOSes have. Google "1 GHz FCLK".


----------



## Zoinho (Oct 15, 2015)

tabascosauz said:


> Oh look, a troll. I see how it is.
> 
> View attachment 68551
> 
> ...


What? This is the video with the 6700K OC'd to 4.8 GHz vs the 4790K stock: youtu.be/e6axdpjyZzA?t=27s . And this is the 6700K stock vs 4790K stock: youtu.be/zKwKOf0WZfw?t=27s


----------



## EarthDog (Oct 15, 2015)

Zoinho said:


> This is not an opinion, this is a fact. For gaming, the 6700K loses to the 4790K.


nice... wasn't even talking about that though. I was giving peche a hard time (even said his name, lol).

Anyway, have fun!


----------



## peche (Oct 15, 2015)

EarthDog said:


> nice... wasn't even talking about that though. I was giving peche a hard time (even said his name, lol).
> 
> Anyway, have fun!


----------



## Zoinho (Oct 16, 2015)

peche said:


>





EarthDog said:


> nice... wasn't even talking about that though. I was giving peche a hard time (even said his name, lol). Anyway, have fun!



Lol... Now I understand. Sorry... I thought you were talking about the title of the article. But it resulted in a good discussion.



FordGT90Concept said:


> That's more of a GPU test than a CPU test.  CPU differences are best shown at low resolution and low settings.





phanbuey said:


> Definitely true - I think it just shows the immaturity of the platform / drivers if anything.



There is no difference at high resolution? Then why buy a faster CPU?
In many real benchmarks the 4790K wins.
Here are 15 benchmarks: youtube.com/watch?v=NoFzi3WdNZo&list=PLC85l4CwqZZBGSYnXRv0d3nUzKP70CXMA


----------



## Shmoo (Oct 16, 2015)

I have i7-6700K + EVGA GTX 980 Ti Classified and get some pretty amazing benchmarks at stock clocks. Will post any bench you would like to see upon request. I'm trying to wait until I get all the gear for my CPU + GPU custom liquid cooling loop before I really cut loose overclocking. Only thing I am frustrated about is that I ended up with two i7-6700K's. I auctioned one off on eBay and the winner did not pay. So...I am stuck on BenQ 1080p 144Hz until I can sell the extra one or get a refund. Then I am planning on BenQ 1440p 5ms IPS. When Pascal hits I'll probably do 2160p.


----------



## R-T-B (Oct 16, 2015)

Shmoo said:


> I have i7-6700K + EVGA GTX 980 Ti Classified and get some pretty amazing benchmarks at stock clocks. Will post any bench you would like to see upon request. I'm trying to wait until I get all the gear for my CPU + GPU custom liquid cooling loop before I really cut loose overclocking. Only thing I am frustrated about is that I ended up with two i7-6700K's. I auctioned one off on eBay and the winner did not pay. So...I am stuck on BenQ 1080p 144Hz until I can sell the extra one or get a refund. Then I am planning on BenQ 1440p 5ms IPS. When Pascal hits I'll probably do 2160p.



My 6700K will kick most 4790K systems to the curb as well. If I forcefully turn the FCLK down to 800 MHz (my board's BIOS lets me do this) it starts to slow down slightly in FPS, but either way not much to worry about.

Nearly all boards now have 1 GHz FCLK as the default, IIRC.


----------



## FordGT90Concept (Oct 16, 2015)

Zoinho said:


> Here are 15 benchmarks: youtube.com/watch?v=NoFzi3WdNZo&list=PLC85l4CwqZZBGSYnXRv0d3nUzKP70CXMA


You know, all of these videos you're posting are useless. They're showing realtime framerate at very high settings, which says a lot of nothing. All any of us are going to care about is the range (min, max) and average.


Only old games that are limited to one or two cores really need a faster CPU. Newer games are far more limited by the GPU.


----------



## Frag_Maniac (Oct 17, 2015)

R-T-B said:


> My 6700k will kick most 4790k systems to the curb as well.



I have a hard time buying this statement, since all the initial reviews of the 6000-series chips, and DDR4 in general, showed no gain over a 4790K/DDR3 setup. This isn't the kind of thing that gets better with drivers, either, as with new GPU models.

I get the feeling some are just exaggerating to try to justify having spent more to get equal performance, just for a slightly better chipset. In that sense, though, the benefits of Z170 are wasted on zero gain in CPU and RAM.

If one already has an adequate system, the smart thing to do is wait until both Zen and Pascal release; 2015 has been a pretty uneventful year for gaming performance improvements platform-wise.


----------



## FordGT90Concept (Oct 17, 2015)

DDR4 will eventually reach 4,266 MT/s (34,128 MB/s)
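That figure is just the 64-bit channel width at work; a one-function sanity check:

```python
# A 64-bit DDR channel moves 8 bytes per transfer, so peak bandwidth in
# MB/s is simply the transfer rate in MT/s times 8.

def ddr_peak_mb_s(mt_per_s):
    return mt_per_s * 8

print(ddr_peak_mb_s(4266))  # 34128 -> matches the quoted 34,128 MB/s
print(ddr_peak_mb_s(2133))  # 17064 -> the stock DDR4-2133 figure
```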


----------



## cadaveca (Oct 17, 2015)

FordGT90Concept said:


> DDR4 will eventually reach 4,266 MT/s (34,128 MB/s)





What do you mean, eventually? You can buy those speeds now...


http://www.newegg.com/Product/Product.aspx?Item=N82E16820231956





As to the thread title: the 4790K turbos to 4.2 GHz on all cores, while the 6700K only gets 4.0. So the idea that they trade blows at times in benchmarks at stock should have been rather obvious. But once you start clocking the 6700K, it leaves the 4790K trembling in fear for its life.


----------



## FordGT90Concept (Oct 17, 2015)

Overclocking... I'm talking standard, which is currently DDR4-2133.

Are there any CPUs/MBs that can handle that much throughput?


----------



## cadaveca (Oct 17, 2015)

FordGT90Concept said:


> Overclocking... I'm talking standard, which is currently DDR4-2133.


Meh. Same shite, eh? Doesn't matter much from where I'm sitting. The latency reduction at those speeds is really nice. It won't be long before other modules meet those specs for sure, but not with current platforms. Next-gen, it is looking pretty good.


----------



## Frag_Maniac (Oct 17, 2015)

cadaveca said:


> What do you mean, eventually? You can buy those speeds now...
> 
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16820231956
> ...



Yeah, if you want to spend over $500 just on RAM and get CAS timings of *19*. RAM speed makes very little difference in gaming, too. On DDR3, anything above 1866 is pretty much irrelevant.

And as far as I recall, the marginal OC headroom on the 6000-series chips was one of the many complaints when they debuted. How would it be any better now? Oh wait, you're probably talking another $500 on a CPU water-cooling kit. LOL

So what I'm getting from all this is that the 4790K-smacking performance claims are really just some synthetic bench numbers with nothing to do with gaming, and come at the expense of what you could build a pretty decent entire gaming rig for?


----------



## FordGT90Concept (Oct 17, 2015)

Hmm, higher turbo? That's about the only thing I see that would cause the 4790K to get better scores, and it would only apply to games that are largely single/dual-threaded.

It's surprising that wattage went up when the process node went down.


----------



## cadaveca (Oct 17, 2015)

Frag Maniac said:


> And as far as I recall, the OCing overhead on the 6000 chips being marginal was one of the many complaints when they debuted. How would it be any better now? Oh wait, you're probably talking another $500 on a CPU WC kit. LOL



Meh, my 6700K (in the Z170 launch review here) did 5 GHz under an H90. My second does 4.9 GHz, both while not breaking 80°C at load. The same sort of voltage on a 4790K would need that $500 WC kit, yep. That's where the 6700K excels... it's like the old 2600K that worked great with a $35 air cooler.

But that said, it's a mixed bag... an OC is never guaranteed. At stock, though, I don't see my 4790Ks beating my 6700Ks in gaming at all.

Also, RAM DOES matter to minimum FPS in games, especially when using multiple GPUs. I've been saying this for years, and now there are sources other than me who have tested and verified it. They're not HUGE gains, but there are gains to be had.


----------



## FordGT90Concept (Oct 17, 2015)

Yeah...DMI3 should provide a pretty big benefit to 6700K, especially if you're scratching on the limits of DMI2.


----------



## Shmoo (Oct 17, 2015)

All the reviews showing the 4790K beating the 6700K are pretty much irrelevant, because Z170 is an entirely different chipset and has higher-bandwidth RAM available for purchase. There is no true apples-to-apples comparison, especially once you make use of a faster RAM kit. The faster RAM kits seem to be what it takes to start pulling ahead significantly and really leave the older chips in the dust. You can buy DDR3-3100 if you really want to spend around $1K USD! So we should use slower RAM why? Just to prove that the higher DDR4 latency interferes enough to see little benefit, all while underclocking to 3.5 GHz and using DDR4-2133/2400? I thought the goal was to have a faster computer! Does it really matter what parts we have to combine or what we have to do to accomplish this?

I upgraded from an i7-3770K, by the way. If you are on Haswell or Haswell Refresh then I can understand the hesitation. I would actually recommend skipping a generation unless you just have nothing better to spend your money on. My Cinebench R15 bench was with the 6700K overclocked to 4.7 GHz and a mild OC on the 980 Ti as well. Without being overclocked, my 6700K got a 933cb if I remember right. Being just underneath the i7-3930K is not too bad for a quad core. All I care about is having a faster computer overall than my 3770K rig. Most of my benches have been really close to the X99 chips, right up there. FPS in games is going to be mostly GPU-dependent, and when it's a little more CPU-dependent, the faster RAM seems to really help. I wanted Z170 mostly for the storage connectivity upgrades, and in that department the Z170 chipset really shines. I wanted to retire my use of hard drives forever and so far it has been an amazing choice. SSDs may not affect gaming directly, but I can download and install games much faster now on the Samsung SM951.


----------



## R-T-B (Oct 17, 2015)

Frag Maniac said:


> I have a hard time buying this statement since all the initial reviews of the 6000 chips, and DDR4 in general, showed no gain over a 4790k/DDR3 setup. This isn't the kind of thing that gets better with drivers as with new GPU models either.
> 
> I get the feeling some are just exaggerating to try to justify having spent more to get equal performance, just for a slightly better chipset. In that sense though, the benefits of Z170 are wasted on zero gain in CPU and RAM.
> 
> If one already has an adequate system, the smart thing to do is wait until both Zen and Pascal release, but 2015 has been pretty much a non eventful year for gaming performance improvements platform wise.



"Kicking it to the curb" was me getting enthusiastic, admittedly.  But I've owned Skylake, Haswell, and Haswell-E.  With a proper FCLK, Skylake is an improvement, admittedly a mediocre one, but there all the same.

Given equal ASIC quality, I can likely beat most any earlier gen.  That's really about the extent of it, maybe a 2-5% margin.

Also, I'm talking raw CPU performance.  I never said anything about gaming.  Heck, games care very little about CPU these days.  I barely noticed my jump from my i7 990x.


----------



## Zoinho (Oct 17, 2015)

"Have 15 benchmarks: youtube.com/watch?v=NoFzi3WdNZo&list=PLC85l4CwqZZBGSYnXRv0d3nUzKP70CXMA"



FordGT90Concept said:


> You know all of these videos you're posting are useless.  They're showing realtime framerate at very high settings which says a lot of nothing.  All that any of us are going to care about is the range (min, max) and average.
> Only old games really need a faster CPU where they're limited to one or two cores.  Newer games are far more limited by the GPU.



Then why buy a faster CPU?


----------



## FordGT90Concept (Oct 17, 2015)

For DMI3, DDR4, and TSX-IN.


----------



## Zoinho (Oct 17, 2015)

FordGT90Concept said:


> For DMI3, DDR4, and TSX-IN.



Do you understand that the 6700K loses in many tests even with these features? "Have 15 benchmarks: youtube.com/watch?v=NoFzi3WdNZo&list=PLC85l4CwqZZBGSYnXRv0d3nUzKP70CXMA"


----------



## R-T-B (Oct 17, 2015)

Zoinho said:


> Do you understand that the 6700K loses in many tests even with these features? "Have 15 benchmarks: youtube.com/watch?v=NoFzi3WdNZo&list=PLC85l4CwqZZBGSYnXRv0d3nUzKP70CXMA"



That's the chipset.  It offers things like USB 3.0 and better bandwidth for storage devices.  It has nothing to do with gaming FPS.  You are missing his point.

And I already debunked most of those benchmarks: I guarantee you they haven't flashed the latest BIOS, and in some cases they are probably even aware of it, because they are trying to prove something.  Thus, they are suffering from the 800MHz FCLK issue that is easily worked around with a new board or a BIOS flash.

Youtube in general is not a good place to draw reviews from.

A better way to evaluate this would be a CPU oriented benchmark anyways.


----------



## FordGT90Concept (Oct 17, 2015)

TSX-IN could be huge for games in the future.  It improves multithreaded locking performance.
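
TSX itself is a hardware feature (speculatively executing a critical section and committing only if no other core touched the same data), but the optimistic execute-validate-retry idea behind it can be sketched in software. A minimal Python sketch of that pattern follows; `VersionedCounter` and its retry loop are purely illustrative, not any real TSX API, and real TSX does this in silicon with far less overhead:

```python
import threading

class VersionedCounter:
    """Optimistic counter: mimics the transactional retry pattern
    that hardware lock elision (e.g. Intel TSX) performs in silicon."""

    def __init__(self):
        self._lock = threading.Lock()  # fallback path, like TSX abort -> real lock
        self._version = 0
        self._value = 0

    def increment(self, max_retries=10):
        for _ in range(max_retries):
            # "Transaction begin": read a snapshot without holding the lock.
            v_seen = self._version
            new_value = self._value + 1
            # "Commit": succeeds only if no other writer got in between.
            with self._lock:
                if self._version == v_seen:
                    self._value = new_value
                    self._version += 1
                    return
            # Conflict detected: retry, just as TSX re-executes after an abort.
        # Too many conflicts: take the lock pessimistically (fallback path).
        with self._lock:
            self._value += 1
            self._version += 1

    @property
    def value(self):
        return self._value


counter = VersionedCounter()
threads = [threading.Thread(target=lambda: [counter.increment() for _ in range(1000)])
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter.value)  # 4000: every increment lands exactly once
```

The win in hardware is that, when threads rarely conflict, the lock is never actually taken, so independent critical sections run fully in parallel instead of serializing on the mutex.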


----------



## Zoinho (Oct 17, 2015)

FordGT90Concept said:


> TSX-IN could be huge for games in the future.  It improves multithreaded locking performance.



The correct name is TSX-NI, and the 4790K has it too.


----------



## FordGT90Concept (Oct 17, 2015)

My bad, and no, it does not.


----------



## Zoinho (Oct 17, 2015)

FordGT90Concept said:


> My bad, and no, it does not.



https://communities.intel.com/message/280337


----------



## Frag_Maniac (Oct 17, 2015)

cadaveca said:


> ...that said, it's a mixed bag... OC is never guaranteed. But at stock, I don't see my 4790Ks beating my 6700K's in gaming at all.


More to the point, does the 6700k consistently beat the 4790k at stock speeds in gaming? After all, with each new generation of chip, isn't that really the question?

While you ARE being realistic enough to acknowledge that OCing is variable and therefore should not be used to compare performance, you really aren't comparing gaming performance at all, just clock speeds. It makes it sound like one of those dime-a-dozen silly world-record attempts, where all they care about are clocks, with no regard for real-world performance. It's quite a bit more money to spend on 1151 MBs, the 6000 series chips, and DDR4, only to get jacked about OCed clocks.





R-T-B said:


> I never said anything about gaming.  Heck, games care very little about CPU these days.  I barely noticed my jump from my i7 990x.


Which is why blanket statements like "kick it to the curb" are so misleading, as I implied. As far as game benching goes, it depends a lot on which games you test. Some are obviously going to be more CPU dependent than others.

It just seems like we're seeing some dodgy excuses now that Intel's new gen of chip comes wrapped in a nice little feature rich chipset and new RAM standard. Suddenly little to no gaming performance gains are considered the norm, whereas in the past, you could usually count on a noticeable improvement.

As I've said before, I really hope Zen beats the 6000 series chips on price to performance and gains AMD back some respect, while taking Intel down off their complacency a bit.


----------



## Zoinho (Oct 17, 2015)

Frag Maniac said:


> More to the point, does the 6700k consistently beat the 4790k at stock speeds in gaming? After all, with each new generation of chip, isn't that really the question?
> 
> While you ARE being realistic enough to acknowledge OCing is variable and therefore should not be used to compare performance, you really aren't comparing gaming performance at all, just clock speeds. It makes it sound like one of those dime a dozen silly world record attempts, where all they care about are clocks, with no regard for real world performance. It's quite a bit more money to spend on Skylake, the 6000 series chips, and DDR4, to only get jacked about OCed clocks.Which is why blanket statements Like "kick it to the curb" are so misleading as I implied. As far as game benching goes, it depends a lot which ones you test. Some are obviously going to be more CPU dependent than others.
> 
> ...



After two generations, the old 4790k still beats the new 6700k ... We spend our time in games, don't we? Then that is what's important! In 15 benchmarks, the old 4th generation (with DDR3) beats the new 6th gen with DDR4! : youtube.com/watch?v=NoFzi3WdNZo&list=PLC85l4CwqZZBGSYnXRv0d3nUzKP70CXMA


----------



## R-T-B (Oct 18, 2015)

Zoinho said:


> https://communities.intel.com/message/280337



That's a rare circumstance, and it could cause many issues.  There's a reason it's usually disabled via microcode: it's a bug in the silicon.

As for the continuous posting of questionable youtube reviews, I'm choosing not to respond to those because you continuously ignore my points that effectively invalidate them.

And I know you find this hard to believe, but not everyone spends their time in games, actually.


----------



## cadaveca (Oct 18, 2015)

Frag Maniac said:


> More to the point, does the 6700k consistently beat the 4790k at stock speeds in gaming? After all, with each new generation of chip, isn't that really the question?
> 
> While you ARE being realistic enough to acknowledge OCing is variable and therefore should not be used to compare performance, you really aren't comparing gaming performance at all, just clock speeds. It makes it sound like one of those dime a dozen silly world record attempts, where all they care about are clocks, with no regard for real world performance. It's quite a bit more money to spend on 1151 MBs, the 6000 series chips, and DDR4, to only get jacked about OCed clocks.Which is why blanket statements Like "kick it to the curb" are so misleading as I implied. As far as game benching goes, it depends a lot which ones you test. Some are obviously going to be more CPU dependent than others.



Meh. I really do feel the 6700K is better all around than 4790K. Keep in mind that my hardware is free and I can choose any platform to build my own gaming machine with, and I chose X99. To me, SKT-1151 is for people that are broke and can't afford real gaming hardware.


----------



## GreiverBlade (Oct 18, 2015)

cadaveca said:


> Meh. I really do feel the 6700K is better all around than 4790K. Keep in mind that my hardware is free and I can choose any platform to build my own gaming machine with, and I chose X99. To me, SKT-1151 is for people that are broke and can't afford real gaming hardware.


/sarcasme  (at its finest, unless i am mistaken  )

my 6600K is fine .... thanks  and better than my fried ex-4690K also

edit: surprisingly, I have a friend who really thinks like that ... well, his 5820K makes no real difference over my 6600K, but his 980 Ti SLI does ... though in each "real life situation" test we did with my 980 used on both setups, the result is: he paid way more than me for nearly exactly the same result in games (not in editing, rendering and such, where an X99 setup takes the advantage it should). And the best part? He actually does nothing but play games on it ... nope, not even live streaming.


----------



## FordGT90Concept (Oct 18, 2015)

cadaveca said:


> Meh. I really do feel the 6700K is better all around than 4790K. Keep in mind that my hardware is free and I can choose any platform to build my own gaming machine with, and I chose X99. To me, SKT-1151 is for people that are broke and can't afford real gaming hardware.


But, but, but, what game put >4 cores to work?


----------



## Zoinho (Oct 18, 2015)

FordGT90Concept said:


> But, but, but, what game put >4 cores to work?


F1 2014, F1 2015, strategy games like Total War: Rome II, and others.


----------



## darkangel0504 (Oct 18, 2015)

Is an i5 2550K @ 4.5GHz good for future games?

So should I upgrade my CPU?


----------



## FordGT90Concept (Oct 18, 2015)

Zoinho said:


> F1 2014


Would run perfectly fine on three cores.



Zoinho said:


> F1 2015


Not finding good information on this one, but looking at the game and its predecessor, I'd be shocked if it needed more than a quad core.



Zoinho said:


> Total War: Rome II


Perfectly fine on a quad core.  People wish it would use more, but it doesn't.


----------



## Frag_Maniac (Oct 18, 2015)

cadaveca said:


> Meh. I really do feel the 6700K is better all around than 4790K. Keep in mind that my hardware is free and I can choose any platform to build my own gaming machine with, and I chose X99. To me, SKT-1151 is for people that are broke and can't afford real gaming hardware.




I'm hearing "my hardware is free" much louder than "better all around", esp regarding gaming. Even review sites evaluate with price factored in, because they know most people reading their reviews are actual consumers of the product.

The bottom line is, if it were really consistently better in gaming, there would be a LOT of benches proving it, and there aren't.


----------



## dorsetknob (Oct 18, 2015)

R-T-B said:


> And I know you find this hard to believe, but not everyone spends their time in games, actually.



And those that do go to game-oriented sites. They only come here to technical sites when their ufooltube site cannot answer their problems.


----------



## GreiverBlade (Oct 18, 2015)

Frag Maniac said:


> I'm hearing "my hardware is free" much louder than "better all around", esp regarding gaming. Even review sites evaluate with price factored in, because they know most people reading their reviews are actual consumers of the product.
> 
> The bottom line is, if it were really consistently better in gaming, there would be a LOT of benches proving it, and there aren't.


well ...

notice that cadaveca defines himself as sarcastic 98% of the time (iirc ...). You took that post seriously???? aaawwww, I thought my post cleared that up  


GreiverBlade said:


> /sarcasme  (at its finest, unless i am mistaken  )
> 
> my 6600K is fine .... thanks  and better than my fried ex-4690K also
> 
> edit: surprisingly, I have a friend who really thinks like that ... well, his 5820K makes no real difference over my 6600K, but his 980 Ti SLI does ... though in each "real life situation" test we did with my 980 used on both setups, the result is: he paid way more than me for nearly exactly the same result in games (not in editing, rendering and such, where an X99 setup takes the advantage it should). And the best part? He actually does nothing but play games on it ... nope, not even live streaming.


technically my 6600K setup was free ... insurance refund and things of the sort. Also ... I saw a lot of spreadsheets and graphs in the computer press ... the only ones that really sit above a 6700K or a 6600K:
example 1, Batman: Arkham Origins and BF Hardline: none ... 1st and 2nd place: 6700K, 6600K
example 2, Total War: Rome II and Company of Heroes: 1st 5960X, 2nd 6700K, 3rd 5930K, 4th 6600K, 5th 5820K
(wait ... my friend's 5820K is actually under my 6600K, woohooo, happy time bahahah  1fps less tho ahah! )


----------



## Shmoo (Oct 18, 2015)

1. The 6700K in that video I watched was underclocked to 3.5GHz. I'm not sure how many underclockers there are out there...but I am shooting for faster myself.
2. They are most likely using DDR4 2133 or DDR4 2400 because in the reviewer's mind that makes them even...not true...latency is much higher on the DDR4, which gives it a handicap right off the bat, no matter how small or big. It would be hard to monitor how often and how quickly each game can access that RAM.
3. CPU + RAM is what determines the speed difference (if it's the same GPU), and there are no two RAM kits that I am aware of that would be close enough for a fair comparison.
4. If you can run DDR3 2133 or DDR3 2400 at CL15 it might be a bit more of a fair match. Not sure if this is possible or not, since I am too busy making computers faster instead of slower! Why would anyone want to cripple the 4790K if they are trying to prove the opposite, though? This is biased, regardless of which side you take.
5. There is no 100% fair way to compare the two platforms. Different socket, different chipset. Even regardless of RAM.
6. All these reviews have been influenced by the half truth that DDR4 is slower than DDR3. That is true for the slower DDR4 speeds. Once you get higher-bandwidth DDR4 it is no longer the case. Don't take my word for it, research it or ask the actual manufacturers and vendors! Moving from one bandwidth up to the next is not a big difference. Jumping from 8GB DDR3 1866 to 16GB DDR4 3000, I can definitely tell the difference.
7. I am sure that there will be plenty who try to convince us that the RAM does not make that big of a difference. If it determines the overall speed of the computer then it _does _make a difference. Even the latency makes a difference. No matter how small that difference is calculated to be, it will be an advantage or disadvantage...depending on which side you want to win.
8. Never mind that the 6700K overclocks like a champ with higher voltage and better cooling. Let's try to prove it is not worth buying by crippling its clock speed and using slower RAM...yeah okay, whatever helps you sleep better at night!


----------



## RejZoR (Oct 18, 2015)

RAM speed doesn't really affect anything. Downclock does however...


----------



## Zoinho (Oct 18, 2015)

Shmoo said:


> 1. The 6700K in that video I watched was underclocked to 3.5GHz. I'm not sure how many underclockers there are out there...but I am shooting for faster myself.
> 2. They are most likely using DDR4 2133 or DDR4 2400 because in the reviewer's mind that makes them even...not true...latency is much higher on the DDR4, which gives it a handicap right off the bat, no matter how small or big. It would be hard to monitor how often and how quickly each game can access that RAM.
> 3. CPU + RAM is what determines the speed difference (if it's the same GPU), and there are no two RAM kits that I am aware of that would be close enough for a fair comparison.
> 4. If you can run DDR3 2133 or DDR3 2400 at CL15 it might be a bit more of a fair match. Not sure if this is possible or not, since I am too busy making computers faster instead of slower! Why would anyone want to cripple the 4790K if they are trying to prove the opposite, though? This is biased, regardless of which side you take.
> ...



"Let's try to prove it is not worth buying by crippling it's clock speed and using slower RAM..."
slow RAM 4790k(ddr3 1.6ghz) beats the 6700k(ddr4 3.0GHz) high RAM: youtube.com/watch?v=NoFzi3WdNZo&list=PLC85l4CwqZZBGSYnXRv0d3nUzKP70CXMA


----------



## Shmoo (Oct 19, 2015)

Zoinho said:


> "Let's try to prove it is not worth buying by crippling it's clock speed and using slower RAM..."
> slow RAM 4790k(ddr3 1.6ghz) beats the 6700k(ddr4 3.0GHz) high RAM: youtube.com/watch?v=NoFzi3WdNZo&list=PLC85l4CwqZZBGSYnXRv0d3nUzKP70CXMA



I am never going to believe that in a million years, sorry! Nowhere in the video does it say what RAM was used and even if it said that I still would not believe it. I also notice that the 6700K has much better thermal performance at that clock speed!


----------



## R-T-B (Oct 19, 2015)

YouTube is full of "reviewers" with an axe to grind.  They probably set out from the get-go to prove the 4790K is better.  In philosophy this is referred to as a "self-fulfilling prophecy."  In science, it's a violation of the scientific method.

It doesn't take long, looking at reputable review sites, to realize how invalid this whole thread is.


----------



## FordGT90Concept (Oct 19, 2015)

RejZoR said:


> RAM speed doesn't really affect anything. Downclock does however...


It could, because of DDR4's extra latency. DDR4 has to run at ridiculously high bandwidth to compensate compared to DDR3, but this happened with DDR2, and with DDR before it, as well.  At the end of the day, bandwidth is more important.



Zoinho said:


> "Let's try to prove it is not worth buying by crippling it's clock speed and using slower RAM..."
> slow RAM 4790k(ddr3 1.6ghz) beats the 6700k(ddr4 3.0GHz) high RAM:


I believe you meant DDR3-1600 MT/s (standard memory speed), which is 8-8-8 to 11-11-11 at 12,800 MB/s, and DDR4-3000 MT/s (overclocking memory speed), which is 15-15-15 to 16-17-17 at 24,000 MB/s.  It's not apples to apples.  DDR4-2133 MT/s is standard for Skylake.  Skylake's memory controller also supports DDR3L.
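
The bandwidth-versus-latency trade-off here is easy to check with back-of-the-envelope arithmetic: peak bandwidth is the transfer rate times 8 bytes per transfer (64-bit channel), and first-word (CAS) latency is the CL count divided by the I/O clock, which runs at half the MT/s rate. A rough sketch, using CL9 for DDR3-1600 and CL15 for DDR4-3000 as representative timings (real kits vary):

```python
def peak_bandwidth_mb_s(mt_s):
    # A 64-bit (8-byte) channel moves 8 bytes per transfer.
    return mt_s * 8

def cas_latency_ns(mt_s, cl):
    # The I/O clock runs at half the transfer rate (double data rate),
    # so one clock period is 2000 / MT/s nanoseconds.
    return cl * 2000 / mt_s

print(peak_bandwidth_mb_s(1600))   # 12800 MB/s (DDR3-1600)
print(peak_bandwidth_mb_s(3000))   # 24000 MB/s (DDR4-3000)
print(cas_latency_ns(1600, 9))     # 11.25 ns (DDR3-1600 CL9)
print(cas_latency_ns(3000, 15))    # 10.0 ns (DDR4-3000 CL15)
```

So despite the bigger CL number, DDR4-3000 CL15 actually has slightly lower absolute latency than DDR3-1600 CL9, plus nearly double the bandwidth; the latency handicap only really applies at baseline DDR4-2133 speeds.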


----------



## Vayra86 (Oct 19, 2015)

When people need Youtube to prove a point, you know they have none.

Move along.


----------



## yogurt_21 (Oct 19, 2015)

I thought this was settled already. The 4790K and 6700K are about the same at the same clock speed. Z170 vs Z97 is where the difference happens: CPUs being equal, and chipset plus memory being better, the net goes toward Z170. The fact that it's not worth it to jump to Z170 from Z97 doesn't matter. When was the last time it made financial sense to jump generations in Intel CPUs for gaming?

2500k vs 3570k
2600k vs 3770k 

performance wise it made no financial sense to make either of these upgrades.

3570k vs 4670k
3770k vs 4770k

many thought this was actually a downgrade, since the 3000 series clocked better.

4670k vs 4690k
4770k vs 4790k

no financial sense.

Broadwell didn't get a gaming high end, so there's nothing to compare to there; this is the next gen, not a two-gen jump.

so it stands to reason that 4790k vs 6700k makes no financial sense either.

Were there reasons to upgrade outside of raw CPU performance? Yes. Would it make much more sense to stay on your setup a few gens and then upgrade? Yes.

Think about it: even 2500K vs 4690K isn't the best use of your money; you'd likely get more bang for the buck out of a GPU upgrade, or a storage upgrade.

why is this even a debate?


----------



## dorsetknob (Oct 19, 2015)

yogurt_21 said:


> why is this even a debate?


Because new member @Zoinho insists on revisiting Spewtube for his info; with every reply, more of the same


Vayra86 said:


> When people need Youtube to prove a point, you know they have none.
> 
> Move along.


Best quote in this thread. May I add: "Mods, lock this sucker as well!!!!"


----------



## cadaveca (Oct 19, 2015)

yogurt_21 said:


> why is this even a debate?



It's not. If you do not have both CPUs in this discussion (and I want a pic to prove it), then your opinion is pure conjecture. (I'm speaking generally here.)

YouTube views = money for the channel. That's all.

I can build two systems right now with each of these chips and post results easily, ending this discussion in a second. I choose not to, simply because the arguments here are comical to me. There are other users here who could do the same.

So I'll ask: Zoinho, where's your picture of your chips? Until you supply that and post real results, your posts are against forum rules and will get you banned. All it takes is someone to report your posts as FUD....


----------



## dorsetknob (Oct 19, 2015)




----------



## TheHunter (Oct 19, 2015)

IMO the 6700K only starts to matter with DDR4 if you have at least a 3GHz "low"-latency kit, e.g. CL14-CL15; then it's the extra 5-7GB/s that saves it against, say, a DDR3 2400MHz CL10 kit..



Shmoo said:


> I have i7-6700K + EVGA GTX 980 Ti Classified and get some pretty amazing benchmarks at stock clocks. Will post any bench you would like to see upon request. I'm trying to wait until I get all the gear for my CPU + GPU custom liquid cooling loop before I really cut loose overclocking. Only thing I am frustrated about is that I ended up with two i7-6700K's. I auctioned one off on eBay and the winner did not pay. So...I am stuck on BenQ 1080p 144Hz until I can sell the extra one or get a refund. Then I am planning on BenQ 1440p 5ms IPS. When Pascal hits I'll probably do 2160p.



I was wondering what it scores in OpenGL; guess it's as good as it gets there..

My personal max in OpenGL was 194fps; here it's 192fps, GPU stock 954MHz, CPU 4.6GHz.


----------



## R-T-B (Oct 19, 2015)

cadaveca said:


> I can build two systems right now with each of these chips and post results easily, ending this discussion in a second. I choose not too, simply because the arguments here are comical to me. There are other users here that could do the same.



I'd like to think my arguments about the FCLK issue have been a little more than "comical"

Though, there surely has been some comedy sources around here....

Moving on, I still have a 5820K in one of my builds.  If it weren't for the DDR4 I'd bench it and call it "good enough to compare."  I don't though, and fully admit as much.  I can't be certain I'm correct in this assumption any more than I can be certain the moon landings were real (I didn't do it, etc.), but I can be pretty darn close to it.

What we really should do is find a 4790K user and have them post a 4GHz bench.  I'll match the bench with my Skylake system.  Then we'll see.  Offer's on the table.


----------



## Frag_Maniac (Oct 19, 2015)

Shmoo said:


> All these reviews have been influenced by the half truth that DDR4 is slower than DDR3. That is true for the slower DDR4 speeds. *Once you get higher bandwidth DDR4 it is no longer the case*.



And once you do that, you make it an even more expensive platform upgrade, for what are still pretty much non-gaming performance differences.

I've yet to see one person justify the expense of Skylake gaming-wise, and the reason that is upsetting to many, including even those like myself who are on older platforms, is that Skylake is supposed to be a budget platform.

The 6000 series chips are overpriced, the 1151 MBs are overpriced, and the DDR4 RAM is certainly overpriced, while the combined lot of them underperforms.

It doesn't take a genius to figure out that for most gamers that is not acceptable, esp. in a budget platform.


----------



## cadaveca (Oct 19, 2015)

R-T-B said:


> I'd like to think my arguments about the FCLK issue have been a little more than "comical"
> 
> Though, there surely has been some comedy sources around here....
> 
> ...



I have both chips, and I also have 3100MHz DDR3 and 3200MHz DDR4. I have boards, and can use the same VGA in both systems. Heck, I could set both up and have them benching simultaneously. Both chips are 4.0GHz stock; the 4790K has higher Turbo.

Here are some benches I did:


http://www.techpowerup.com/reviews/MSI/Z170A_GAMING_M7/11.html


Note that the 4790K did not win at any point. All systems are "stock", so we can end the whole DDR3 vs DDR4 crap.


----------



## Frag_Maniac (Oct 19, 2015)

cadaveca said:


> I have both chips, and I also have 3100 MHZ DDR3 and 3200 MHz DDR4. I have boards, and can use the same VGA in both systems. Heck, I could set both up and have them benching simultaneously. Both chips are 4.0 GHz stock. 4790K has higher Turbo.
> 
> Here some benches I did:
> 
> ...


Yet earlier you claimed better all around, and nowhere there do you have any gaming benches. Enough said.


----------



## TheHunter (Oct 19, 2015)

cadaveca said:


> I have both chips, and I also have 3100 MHZ DDR3 and 3200 MHz DDR4. I have boards, and can use the same VGA in both systems. Heck, I could set both up and have them benching simultaneously. Both chips are 4.0 GHz stock. 4790K has higher Turbo.
> 
> Here some benches I did:
> 
> ...



But as you know, DDR3 loses its bandwidth scaling after 2400MHz, unlike DDR4, which starts to matter after 2800 or 3000MHz.

OK, except for rare mobos that can set timings right and hit 40GB/s @ DDR3 2666MHz; so far I've seen only one, Asrock or was it Gigabyte. All ASUS boards have shitty timings/bandwidth above 2400MHz.


----------



## cadaveca (Oct 19, 2015)

Frag Maniac said:


> Yet earlier you claimed better all around, yet nowhere there do you have any gaming benches. Enough said.


Metro Last Light isn't a game? 3DMark isn't a game?  (kidding on the last one)

SuperPi 32M and wPrime say a lot, as does Cinebench. Toss up a list of games with benchmarks; maybe I'll run the ones I have. I have the 6700K set up for review testing already; the 4790K I'll have to put together. But since I only have a couple of reviews in the works right now, as soon as I am done, if no new samples arrive by the end of the week, I'll run a tonne of benches on the weekend and end this stupidity. There's no way you can lose in every benchmark and then win in games. That's quite the interesting idea.



TheHunter said:


> But as you know DDR3 after 2400MHz loses its bandwidth unlike DDR4 that starts to matter after 2800 or 3000MHZ.
> 
> Ok expect rare mobos that can set timings right and have 40GB/s @ DDR3 2666MHZ, so far I saw only one Asrock or was it Gigabyte mobo, all ASUS have shitty timings/bandwitdh above 2400MHZ.



It's not the boards; only specific memory ICs can do better above 2400MHz. Z97 does have issues over 2400 for sure, no matter what board, but it's the type of sticks used that really matters.


----------



## Frag_Maniac (Oct 19, 2015)

cadaveca said:


> metro last light isn't a game?


Do you honestly think you can validate "better all around", including gaming, with ONE game?

Technically, my statement was correct, no gaming bench*es.*


----------



## TheHunter (Oct 19, 2015)

@cadaveca

I see,

this is what I mean






My PB with Crucial Ballistix Elite CL9 2133MHz was 36,100MB/s @ 2400MHz; anything higher and it drops a lot..


Do you know which memory ICs/sticks are ideal? G.Skill, TeamGroup Xtreem..
http://geizhals.eu/?cat=ramddr3&xf=254_2666~5830_UDIMM1~5828_DDR3~5831_DIMM~253_16384&sort=p


But now I see it's crazy expensive, especially 2800MHz+; DDR4 3000MHz is cheaper now, ~150€ or so..


EDIT: well, guess G.Skill TridentX is out of the question; they barely made it to the 30GB/s mark..
http://www.tweaktown.com/reviews/55...8gb-dual-channel-memory-kit-review/index.html


----------



## EarthDog (Oct 19, 2015)

Jesus Christ... how many reviews did a clock-for-clock comparison already, across a slew of tests??? Sure, there is DDR3 vs DDR4, but that is a variable that almost can't be accounted for. We all (should) know that in the vast majority of cases, the larger pipe DDR4 offers doesn't benefit much of anything past negligible increases in benchmarking anyway.


----------



## cadaveca (Oct 20, 2015)

Frag Maniac said:


> Do you honestly think you can validate "better all around", including gaming, with ONE game?
> 
> Technically, my statement was correct, no gaming bench*es.*


I hear ya, but since I am not TPU's CPU reviewer, just tossing out benchmarks that don't relate to MOTHERBOARD performance is rather silly.

But here's the kicker... different motherboards have different Turbo profiles (as well as other performance-affecting things), leading to differing performance numbers even with all other hardware the same. So most "smaller" differences simply come down to BIOS tuning, and don't reflect the true performance of the CPU installed in the board. That's why I chose to use many MSI GAMING motherboards for that review, since BIOS tuning is relatively equal. Since nearly no one even bothers to mention BIOS tuning as affecting performance, there's more to this than most consider. Creating a level playing field so that comparisons across systems are truly accurate is a very difficult thing.


----------



## Freezer (Oct 20, 2015)

Skylake is more of a reference CPU for future releases... I was highly disappointed this past spring when benchmarks began to leak, which is why I opted for Haswell-E over Skylake. I waited a LONG time for Skylake while running a Q6600. The Q6600 is still a beast in some scenarios but lacking with modern software.


----------



## RealNeil (Oct 20, 2015)

I have a 4770K, a 4790K, and soon, a 5930K.
I still plan to get a 6700K. No way will I not.

I don't care if benches are up a little or a lot. I want the newest that I can afford.


----------



## FordGT90Concept (Oct 20, 2015)

cadaveca said:


> Note that the 4790K did not win at any point. All systems are "stock", so we can end the whole DDR3 vs DDR4 crap.


Except memory latency, but we expected that. XD


----------



## cadaveca (Oct 20, 2015)

FordGT90Concept said:


> Except memory latency, but we expected that. XD


The point of the matter is that there isn't a single benchmark, even with lowered CPU speed via Turbo, where the 6700K loses. Whether DDR4's latency is higher or not doesn't matter; the 6700K still wins, as you can see in the benchmarks in my motherboard review.

So, tell me, how does a CPU that wins in all CPU-related tasks, end up slower in gaming? If CPU doesn't matter for a gaming test, then it would be up to the GPU, right?

Or is memory more important than anyone wants to admit?

Or perhaps, if 6700K does lose in a game, it's simply a BIOS issue?

This is the ultimate question of the thread. If such conditions do arise, why do they arise?

You see, I'm a bit tired of posting stuff that is a bit off from the mainstream, and having people argue against it. I mean, I don't really care about these things, to be honest; my only point in doing reviews from day one was to remove the marketing BS and to post the truth 100%. So to answer outlying questions like this, why should I bother? Let someone else put in the work. I got better things to do... like running all these tests and keeping the results to myself. 


ROFL.


----------



## Zoinho (Oct 20, 2015)

dorsetknob said:


>



Guys, the thread Title: "Skylake i7 6700K lose to Haswell i7 4790K in gaming?"
The answer: yes, the 6700K loses to the 4790K! This is proven.

" After two generations, the old 4790k still beats the new 6700k ... We spend our time in games, don't we? Then that is what's important! In 15 benchmarks, the old 4th generation (with DDR3) beats the new 6th gen with DDR4! : youtube.com/watch?v=NoFzi3WdNZo&list=PLC85l4CwqZZBGSYnXRv0d3nUzKP70CXMA "


----------



## FordGT90Concept (Oct 20, 2015)




----------



## Zoinho (Oct 20, 2015)

Shmoo said:


> I am never going to believe that in a million years, sorry! Nowhere in the video does it say what RAM was used and even if it said that I still would not believe it. I also notice that the 6700K has much better thermal performance at that clock speed!


Here are the RAM speeds from those videos: 6700K DDR4 3.0GHz https://youtu.be/8ZLrskdVEEM?t=25s vs 4790K DDR3 1.6GHz https://youtu.be/MO1cn4AUa4M?t=3s


----------



## dorsetknob (Oct 20, 2015)

Another boring Spewtube video from @Zoinho, still trying desperately to make his point.


----------



## Zoinho (Oct 20, 2015)

dorsetknob said:


> Another boring Spewtube video from @Zoinho, still trying desperately to make his point.


@dorsetknob


----------



## dorsetknob (Oct 20, 2015)

that is a cute pic, thanks

BTW, you still have not proved you have both processors and have independently benched them, AS ASKED


cadaveca said:


> So I'll say, Zoinho, where's your picture of your chips? Until you supply that, and post real results, then your posts are against forum rules, and will have you banned. All it takes in someone to report your posts as FUD....


to confirm your point, so at this point you're full of FUD.
All you can do is post dubious Spewtube vids FROM OTHER PEOPLE.

Edit:
You got any more pictures of the kid? I miss watching my bastards grow up.

*Urban Dictionary: FUD*
http://www.urbandictionary.com/define.php?term=FUD
*Fud - Wikipedia, the free encyclopedia*
https://en.wikipedia.org/wiki/Female_urination_device
https://en.wikipedia.org/wiki/Fear,_uncertainty_and_doubt


----------



## Zoinho (Oct 20, 2015)

What is the title of this thread? Whoever does not want to discuss the 6700K losing to the 4790K should not follow this thread.


----------



## dorsetknob (Oct 20, 2015)




----------



## EarthDog (Oct 20, 2015)

Zoinho said:


> What is the title of this thread? Whoever does not want to discuss the 6700K losing to the 4790K should not follow this thread.


So, you are going to keep throwing up random YT vids, ignoring all the 'professional' reviews and what they found? Even though you (supposedly) have the chips to test it yourself and either learn something or shut these people up?

Dear lord, where is that ignore feature...



... and why has this thread been left open? Every time I show up here, it's like a little more OCN seeps in and stays...


----------



## tabascosauz (Oct 20, 2015)

Zoinho said:


> What is the title of this thread? Whoever does not want to discuss the 6700K losing to the 4790K should not follow this thread.



There's no discussion here that involves you, buddy. It's just you, the one who's linking the same goddamned youtube videos that don't prove anything, versus everyone else, who's trying to discuss.

Can you verify your claims through testing that you personally undertake with the 4790K and 6700K? If not, I'd advise that you go and do some testing for yourself before taking up the cudgels for the youtuber to whom the videos belong.

Also, the video doesn't prove anything. There's no noticeable difference in framerate or consistency. It was the same going from the 2700K to the 3770K, and from the 3770K to the 4770K. It's already established that PCIe clocks were the issue behind the initial lag behind the 4790K; it's all but resolved now. Your point?

It seems that there are all too many brand-new users joining this forum to eagerly promote what they think is comprehensive knowledge and educated opinion on their part. What happened to the unspoken principle of being respectful when wandering into a place/forum that one has never visited before?

*A complete newcomer telling older members to leave? Seems legit.*


----------



## Zoinho (Oct 20, 2015)

tabascosauz said:


> There's no discussion here that involves you, buddy. It's just you, the one who's linking the same goddamned youtube videos that don't prove anything, versus everyone else, who's trying to discuss.
> 
> Can you verify your claims through testing that you personally undertake with the 4790K and 6700K? If not, I'd advise that you go and do some testing for yourself before taking up the cudgels for the youtuber to whom the videos belong.
> 
> ...



Can you prove the benchmark results of any site? I repeat: what is the title of this thread? It is not "It's already established that PCIe clocks...". The title is that the 6700K loses to the 4790K in games. Full stop. I just show the videos that prove the title of this thread.


----------



## cdawall (Oct 20, 2015)

cadaveca said:


> So, tell me, how does a CPU that wins in all CPU-related tasks, end up slower in gaming? If CPU doesn't matter for a gaming test, then it would be up to the GPU, right?



When it is limited by a slow PCIe bus; case in point: LGA775. I am not saying this is a current issue with the 6700K, just that that single scenario would answer your question. There could very well be an issue with PCIe latency that is completely BIOS-based. I own none of these systems, so I have no way to test it.


----------



## dorsetknob (Oct 20, 2015)

We are all still waiting for you to post your results.
Other people's results are just hearsay.
PUT UP OR SHUT UP


----------



## Zoinho (Oct 20, 2015)

dorsetknob said:


> We are all still waiting for you to post your results.
> Other people's results are just hearsay.
> PUT UP OR SHUT UP


First Post yours.


----------



## cdawall (Oct 20, 2015)

Zoinho said:


> First Post yours.



I can post ones showing my FX-9370 beating a 6700K. Don't ask about clock speeds, and I am cherry-picking benchmarks. I am thinking of something with true multithreading in Linux.


----------



## dorsetknob (Oct 20, 2015)

You're the mouth that's flapping in this thread. I do not own either of these CPUs, so I cannot.
But then again, neither do you.
All you want to do is post FUD.

Why don't you just trash the thread so the mods lock it? That way your FUD is preserved.


----------



## Frag_Maniac (Oct 20, 2015)

Freezer said:


> Skylake is more of a reference CPU for future releases... I was highly disappointed this past spring when benchmarks began to leak, which is why I opted for Haswell-E over Skylake. I waited a LONG time for Skylake while running a Q6600. The Q6600 is still a beast in some scenarios but lacking with modern software.



You are not alone my friend. I've been waiting for some time on an i7 950, and now I plan to wait and see what Zen can do.





Zoinho said:


> What is the title of this thread? Whoever does not want to discuss the 6700K losing to the 4790K should not follow this thread.


Seems to me you're glossing over the fact that this isn't just a tech forum, it's a very gaming-oriented tech forum, and gaming-wise, at best, Skylake is disappointing, and that includes DDR4. Every scenario painted here by Skylake advocates has it requiring a LOT of money dumped into it, and still just to gain mostly non-gaming performance.

So then, I ask, what's the point of bothering with Skylake? Why not just opt for an enthusiast platform if you're going to turn what is supposed to be a budget platform into a money pit?


----------



## EarthDog (Oct 20, 2015)

cdawall said:


> I can post ones showing my FX-9370 beating a 6700K. Don't ask about clock speeds, and I am cherry-picking benchmarks. I am thinking of something with true multithreading in Linux.


What is funny is he thanked your post... seemingly not catching on to the cherry-picking, Linux, etc... Good times.

Anyway: http://arstechnica.com/gadgets/2015/08/intel-skylake-core-i7-6700k-reviewed/

Seems like a 4 GHz Haswell and a 4 GHz 6700K are the same here (link above) across these four titles, no? There are several reviews with similar results...



> Every scenario painted here by Skylake advocates has it requiring a LOT of money dumped into it, and still just to gain mostly non-gaming performance.


The cost difference between Haswell/Z97 and Skylake/Z170 really isn't that big. I think I built strikingly similar rigs for a difference of around $50. There is a slight premium for the CPU ($30) and the RAM ($10). That leaves $10 for a slightly more expensive motherboard.


----------



## cadaveca (Oct 20, 2015)

Zoinho said:


> First Post yours.


I did, and asked you to do the same. No response? I wrote the review and ran those benchmarks. You posted YouTube videos that aren't your own. 2nd-hand information is pure conjecture.



EarthDog said:


> The cost difference between Haswell/Z97 and Skylake/Z170 really isn't that big. I think I built strikingly similar rigs for a difference of around $50. There is a slight premium for the CPU ($30) and the RAM ($10). That leaves $10 for a slightly more expensive motherboard.



Pricing and availability issues at launch have confounded this. The recent drops in DDR4 prices were really required to sweeten the deal, for sure. In the end, since Z170 has lower overall power consumption (based on my own testing), paying the extra $25 now or later shouldn't be an issue, either.


----------



## Zoinho (Oct 20, 2015)

Frag Maniac said:


> Seems to me you're glossing over the fact that this isn't just a tech forum, it's a very gaming-oriented tech forum, and gaming-wise, at best, Skylake is disappointing, and that includes DDR4. Every scenario painted here by Skylake advocates has it requiring a LOT of money dumped into it, and still just to gain mostly non-gaming performance.
> 
> So then, I ask, what's the point of bothering with Skylake? Why not just opt for an enthusiast platform if you're going to turn what is supposed to be a budget platform into a money pit?



Even after two generations we have no gain... an old 4790K beats the new generation.


----------



## EarthDog (Oct 20, 2015)

Zoinho said:


> Even after two generations we have no gain... an old 4790K beats the new generation.


It's gaming... in a lot of titles, the CPU won't matter in the first place!! WTH were you expecting? Have you read any reviews of games that show, in many cases, very little difference between generations going back to S1366/Nehalem? Same with AMD CPUs too! The higher the res, the less the CPU matters in a lot of titles (RPGs/MMOs are a different story).

Damn man, do a lot less posting and a lot more LEARNING, would ya? You really look incredibly foolish with your piss-poor counterpoints... and it's rooted in a nearly complete lack of knowledge about how things work and what results you can expect.
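[Editor's note] The resolution point above can be illustrated with a toy frame-time model (all millisecond figures below are made up for illustration, not measurements): a frame is only done when both the CPU and the GPU have finished their share of the work, so once the GPU side dominates, as it does at higher resolutions, a slightly faster CPU barely moves the FPS.

```python
# Toy model: a frame's rate is limited by the slower of the two pipeline stages.
# All timings are invented illustrative numbers, not benchmark data.
def fps(cpu_ms, gpu_ms):
    """Effective frame rate when each frame needs both CPU and GPU work."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Pretend one CPU needs 6.0 ms per frame and a slightly faster one needs 5.5 ms,
# while the GPU cost per frame grows with resolution.
for res, gpu_ms in [("1080p", 5.0), ("1440p", 11.0), ("4K", 22.0)]:
    print(res,
          f"slower CPU: {fps(6.0, gpu_ms):5.1f} FPS,",
          f"faster CPU: {fps(5.5, gpu_ms):5.1f} FPS")
```

With these made-up numbers the faster CPU wins by roughly 9% at 1080p, but at 1440p and 4K both produce identical FPS because the GPU is the bottleneck.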


----------



## dorsetknob (Oct 20, 2015)

cadaveca said:


> 2nd-hand information is pure conjecture.



Or in other words...



Zoinho said:


> What is the title of this thread? Whoever does not want to discuss the 6700K losing to the 4790K should not follow this thread.


@Zoinho
By the way, in case you did not notice, you're not the original thread starter.
So why should you have any say in who follows this thread?


----------



## Zoinho (Oct 20, 2015)

dorsetknob said:


> Or in other words
> 
> 
> 
> ...





trodas said:


> According to the preliminary (NDA) and pulled down tests, Skylake is good for syntetic benches (109% over i7 4790K):
> 
> 
> 
> ...



@dorsetknob The first post of the "Original Post Starter" is right here. I repeat: what is the theme or title of this thread?


----------



## dorsetknob (Oct 20, 2015)

dorsetknob said:


> @Zoinho
> By the way, in case you did not notice, you're not the original thread starter.
> So why should you have any say in who follows this thread?



And many posters have debunked the claim, EXCEPT YOU AND YOUR SPEWTUBE TIRADE


----------



## sneekypeet (Oct 20, 2015)

Only warning: stay on topic, and keep the personal comments to yourselves!
Anything else that brings me here will be awarded points.


----------



## Shmoo (Oct 20, 2015)

All I can say is that I am happier than I have ever been on any platform in the past, even for gaming. This is why I felt compelled to voice my opinion. Of course my 980 Ti doesn't really hurt matters either though! I have never been one who likes to argue in forum threads, everyone is entitled to form their own opinion. If it is not the RAM or RAM latency slowing it down in those comparisons, then maybe it's just the chipset or a combination of a few things? Even at stock clocks I am very impressed when gaming or doing anything pretty much. I got an 82.3% ASIC score on my 980 Ti. I can't wait to get my custom loop going so I can overclock more comfortably. I would highly recommend my combination of parts or similar. _Lightning fast_ PC and never been happier!


----------



## Frag_Maniac (Oct 21, 2015)

EarthDog said:


> ...in many cases, very little differences between generations going back to S1366/Nehalem?



But typically there's always been a gain, sometimes a substantial one, whereas here there's no gain for gaming, and in some cases even worse performance.

People are going to vote with their wallets on this one, and for anyone not getting squat for performance gain over what they have now, Zen looks worthwhile waiting for.

Everything that's been said indicates the only ones interested in Skylake are those at entry-level enthusiast budgets, and news flash, that's a smaller niche market than you typically see for budget platforms.

So it's not just a lot of review sites saying many will pass on it, more importantly, a lot of consumers are not impressed with it, and rightfully so.

The timing is right for a good comeback for AMD.


----------



## EarthDog (Oct 21, 2015)

There are (meager) gains in applications. And typically, from SB -> IB -> Haswell, we didn't see much of anything clock-for-clock between them.

I agree with those reviews, particularly Anand's, which state that if you are on a 2600K or below, it's worth the upgrade.

Again, it depends on the title as to what gains, if any, will come from the CPU. Look at BF4 for an example: http://www.techspot.com/review/734-battlefield-4-benchmarks/page6.html
Notice no difference between the 3770K and 4770K? Look at the i7 920 hanging in there with 500 MHz less on the clocks, and the SIGNIFICANT gains between it and the 4770K... a 3 FPS difference!

DA:I - http://www.techspot.com/review/921-dragon-age-inquisition-benchmarks/page6.html
TW3 - http://www.techspot.com/review/1006-the-witcher-3-benchmarks/page5.html (4670K and 2500K)

Meager gains if not attributed to clock speed, no?

Not sure what the big deal is, since we haven't seen a worthy jump in gaming performance since SB over Nehalem.


----------



## RealNeil (Oct 21, 2015)

Some of the older platforms were outstanding. My i7-2600K is still able to crank out the FPS with decent GPUs inside. My i7-870 is long in the tooth even though it still runs well.

But I'm keeping three systems as upgraded as possible (one for me, two for the grandkids) and I don't want them to be too far behind the latest generation hardware if possible.
So I'll try to swap out my 4770K, my 4790K, and my FX-9590 boxes for Skylake i7s over the next 5 or 6 months. My GPUs are good for a year, maybe a year and a half.


----------



## Frag_Maniac (Oct 22, 2015)

EarthDog said:


> Look at the I7 920 hanging in there... we haven't seen a worthy jump in gaming performance since SB over Nehalem.



Seriously contradicting yourself there. After all, the 920 IS Nehalem.

What I don't see a point to is a few that bit on Skylake, some blatantly admitting for free, grandstanding like it's a revelation. I think the public can decide for themselves.

I've already seen a LOT of people echo that they're waiting to check out Zen, pretty much every time I mention it.


----------



## FordGT90Concept (Oct 22, 2015)

I think 920 to 6700K is worth it.  I'd maybe consider 2600K to 6700K worth it too but other than that, nope.  Comparing 4### to 6### doesn't make any sense either because a) people with 4### shouldn't upgrade to 6### and b) people looking to buy something new now will look at 5### (laptop) or 6### (desktop), not 4###.  If you're in a position to wait until 2017, yeah, stick it out for Zen or Cannonlake.


----------



## EarthDog (Oct 22, 2015)

Frag Maniac said:


> Seriously contradicting yourself there. After all, the 920 IS Nehalem.
> 
> What I don't see a point to is a few that bit on Skylake, some blatantly admitting for free, grandstanding like it's a revelation. I think the public can decide for themselves


Context and comprehension... I was talking about gaming there, which is also the subject of the thread. Clearly the 920 gets worked by modern architectures in everything else. In several games, ones that don't rely on the CPU as much, it can hang in there.

If you think people can decide for themselves, you have too much faith in humanity. The average consumer is an idiot and can't tell the difference between Sandy Bridge, Ivy Bridge, and the Brooklyn Bridge. Lol!


----------



## Frag_Maniac (Oct 22, 2015)

EarthDog said:


> *The average consumer is an idiot* and can't tell the difference between Sandy Bridge, Ivy Bridge, and the Brooklyn Bridge. Lol!



Yet here *you* are advocating a so-called budget platform that realistically has entry-level enthusiast pricing once configured optimally, only to get equal performance in gaming.

The reality is, new CPU gens typically *are* a low performance increase, 5-15% at best on average, but the difference is, there typically *is* an increase, and *not* at enthusiast pricing.

If you can't see that those two factors combined make Skylake an epic fail for most gamers, then you need to seriously redefine your perception of the word idiot. There may be one staring back at you in the mirror, one that can't for the life of him remember where his money went.

For every one consumer out there who is a little naive about components, there are 10 more who think nothing of pissing their money away wastefully just to have the latest crap, even if it isn't the greatest. At the end of the day, bragging rights only make sense if you actually *have* something to brag about.


----------



## cdawall (Oct 22, 2015)

The average consumer doesn't understand why a Pentium at 2.16 GHz is slower than an i7 at 2.0 GHz, regardless of generation.


----------



## Frag_Maniac (Oct 22, 2015)

It doesn't really matter how tech-oriented the average consumer is. I see more and more naive gamers every day taking to tech forums and asking for advice, and trust me, most of them are being told to avoid Skylake.

It was said just a few posts ago that Skylake only makes sense if you have an older CPU/platform. Nope. Check forums like Tom's: people are still advocating socket 1150, the 4690K, DDR3, etc. over Skylake because they get MUCH better gaming value.

And as far as app performance goes, a decent SSD still makes more sense than wasting a lot of money on Skylake.


----------



## cdawall (Oct 22, 2015)

Tom's is full of morons


----------



## EarthDog (Oct 22, 2015)

Frag Maniac said:


> Yet here *you* are advocating a so-called budget platform that realistically has entry-level enthusiast pricing once configured optimally, only to get equal performance in gaming.
> 
> The reality is, new CPU gens typically *are* a low performance increase, 5-15% at best on average, but the difference is, there typically *is* an increase, and *not* at enthusiast pricing.
> 
> ...


wait, what? What are you going on about? PM me...


----------



## Frag_Maniac (Oct 22, 2015)

cdawall said:


> Tom's is full of morons



Is a naive person who has the restraint and common sense to ask for help and actually learn something more of a "moron" than someone who wastes a couple hundred on an equal-performing platform, settling for a GPU 1-2 models lower in the process?

The way I see it, the morons are the ones in denial. They can be of high intelligence, but lack the common sense to use it effectively.

The reality is, Tom's is full of naive gamers looking for help, but it's never more moronic than when a handful of elitists jump into a thread and try to convince them they should spend hundreds more than they need to, without really gaining any performance.


----------



## cdawall (Oct 22, 2015)

Frag Maniac said:


> Is a naive person who has the restraint and common sense to ask for help and actually learn something more of a "moron" than someone who wastes a couple hundred on an equal-performing platform, settling for a GPU 1-2 models lower in the process?
> 
> The way I see it, the morons are the ones in denial. They can be of high intelligence, but lack the common sense to use it effectively.



Oh, I don't care about the people asking. Most of the clowns answering, however, are 15-year-old kids who are idiots.


----------



## Frag_Maniac (Oct 22, 2015)

cdawall said:


> Oh, I don't care about the people asking. Most of the clowns answering, however, are 15-year-old kids who are idiots.



I can see that you either don't chat there much, or chat on boards that are pointless to hang out on. The members who know what sensible value in gaming components looks like far outnumber those who don't have a clue what to advise.

I've seen just as many ill-advised suggestions on elitist tech forums as I have on Tom's. You have to know how to sift through the crap no matter what forum you visit.

Just the fact that you assume a consumer has to have deep tech knowledge of the components to make a sensible purchase is quite an oversight in judgment in itself.

A lot of people here, no matter what tech stuff they know, still refer to benches, and benches most anyone can understand. I think you're arguing yourself into a corner here.


----------



## Zoinho (Oct 22, 2015)

Guys, return to the title of this thread... Post your own benchmarks comparing the 6700K vs. the 4790K, but do not believe DigitalFoundry, because they were paid by Intel... Many other tests prove the 4790K beats the 6700K in gaming.


----------



## sneekypeet (Oct 22, 2015)

Frag Maniac said:


> Is a naive person who has the restraint and common sense to ask for help and actually learn something more of a "moron" than someone who wastes a couple hundred on an equal-performing platform, settling for a GPU 1-2 models lower in the process?
> 
> The way I see it, the morons are the ones in denial. They can be of high intelligence, but lack the common sense to use it effectively.
> 
> The reality is, Tom's is full of naive gamers looking for help, but it's never more moronic than when a handful of elitists jump into a thread and try to convince them they should spend hundreds more than they need to, without really gaining any performance.





cdawall said:


> Oh, I don't care about the people asking. Most of the clowns answering, however, are 15-year-old kids who are idiots.



Tread lightly here fellas, warnings were already given for name calling and making things personal.


----------



## EarthDog (Oct 22, 2015)

Zoinho said:


> Guys, return to the title of this thread... Post your own benchmarks comparing the 6700K vs. the 4790K, but do not believe DigitalFoundry, because they were paid by Intel... Many other tests prove the 4790K beats the 6700K in gaming.


We are still waiting for you to do that...

I also posted some results from another site that showed otherwise... but you missed it?


----------



## tabascosauz (Oct 22, 2015)

@sneekypeet Why the need for more warnings? This thread is like






One guy thinks he owns the place and believes he has the right to tell others to "leave" when he is unable to see their reasoning; DigitalFoundry is "paid by Intel," but YouTube "benchmarks" are the holiest of holies. Why not just let him have the [locked] thread to himself?


----------



## sneekypeet (Oct 23, 2015)

tabascosauz said:


> @sneekypeet Why the need for more warnings? This thread is like
> 
> 
> 
> ...



So now, as mods, we have to run around taking toys away because all of you do not know how to ignore members and get along? What is the point in having open opinion forums then? If warnings are not needed, I will be giving you points for posting off topic. Enjoy!


----------



## Zoinho (Oct 23, 2015)

Compare... a 4790K from the lying DigitalFoundry (youtu.be/4sx1kLGVAF0?t=1m2s) vs. a real 4790K benchmark
(youtu.be/S0uDzU8KTiQ?t=7m35s), and note that this real result exceeds the overclocked 6700K.


----------



## GreiverBlade (Oct 23, 2015)

Zoinho said:


> Compare... a 4790K from the lying DigitalFoundry (youtu.be/4sx1kLGVAF0?t=1m2s) vs. a real 4790K benchmark
> (youtu.be/S0uDzU8KTiQ?t=7m35s), and note that this real result exceeds the overclocked 6700K.


What about a big "we don't care"? Unlike you (unless you actually own both, and stop blindly swallowing YouTube nonsense), I have owned CPUs from both Haswell DT and Skylake S. Unfortunately not at the same time, since my 4690K fried and the insurance covered the original price, which was coincidentally the same price I would have paid for a similar 6600K setup (previously 16 GB DDR3 C10 2400, now 16 GB DDR4 C14 2800). Between the two setups, in the games I play, guess what? No difference at all.

Real-life situation versus benchmark? Well, in the end a benchmark means nothing except as a "general idea" that doesn't even match the results of normal day-to-day use.

A non-gaming benchmark... who cares? Wait... my 6600K is 13% faster in single thread and only 11% slower in multi-thread than a 4790K, eh? Oh wait, I OC to 4.4 instead of 4.0. Not that I really care about that; do you? Obviously not, because if you have a 4690K or a 4790K, a 6600K or 6700K is pointless (well, for me it was, until a certain thunderstorm...).

This thread is pointless in the end... the 6700K and even the 6600K are great value for anyone with a CPU below a 3770K (or an enthusiast platform), which is great news for the average user: if they have Ivy or later (or even an older CPU, depending on their needs), they do not need to change and buy more. (At least in Switzerland they are great value, since Z97 and Haswell DT CPU prices are creeping up and almost reach the prices of their Skylake S equivalents.)


----------



## Zoinho (Oct 23, 2015)

Dude, what is the title of this thread? But if you want synthetic benchmarks: https://www.cpubenchmark.net/singleThread.html and the definition of a benchmark:
http://searchcio.techtarget.com/definition/benchmark


----------



## yogurt_21 (Oct 23, 2015)

sneekypeet has a major point here. Add that member to ignore and this thread looks pretty darn funny. I posted my piece before, but I'll leave it with this. Unless AMD pulls a rabbit out of a hat, there is nothing forcing Intel to increase performance per core or per clock. On the contrary, the consumer base at large seems to be demanding smaller and more power-efficient options rather than power-hungry performance monsters. On the other side you have virtualization pushing for more cores, but not caring how well they perform against prior Intel setups. And no, I'm not just talking enterprise; plenty of enthusiasts run VMs at home. It's far easier to manage than dual boot, and you can have 10 or more VMs to play around with on a single setup.

I'm not even sure the PC gaming market is demanding more performance per clock; when was the last time you were unhappy with CPU performance? So I don't expect many new releases to be significantly faster than the prior leaders. You shouldn't either; there is no reason to. So unless there's a feature you want that's only available on the new platform, there's no need to upgrade. In this case there were several Z170 features that were intriguing, just not enough for me to jump from a Z97 setup. Maybe the next gen.


----------



## EarthDog (Oct 23, 2015)

Zoinho said:


> Compare... a 4790K from the lying DigitalFoundry (youtu.be/4sx1kLGVAF0?t=1m2s) vs. a real 4790K benchmark
> (youtu.be/S0uDzU8KTiQ?t=7m35s), and note that this real result exceeds the overclocked 6700K.


Again, I posted something from another review site that shows different results... Did you look at that? What are your thoughts? As I also stated, there are several other reviews from reputable websites just like it. At least their testing methods are more transparent than those of some YouTube users.


----------



## Tomgang (Oct 23, 2015)

Funny you guys are talking about the old i7 920 and performance. I am still using an old i7 920 D0 CPU, OCed to 4.3 GHz, together with two GTX 970s in SLI. And I will say, for its age it performs very well indeed. Sure, a new i7 6700K would be nice and would surely give a few more FPS in games, but at what price?

Back when I got my i7 920 I would never have imagined keeping the same CPU for 6.5 years and still going strong. I have been thinking about getting an i7 970/980 or a 6-core Xeon and OCing the crap out of it to give the old X58 platform a longer lifespan.

If you guys want some 3DMark Fire Strike scores of my old machine with some overclock, take a look below.

Single-card run, score 10911:

http://www.3dmark.com/fs/5850771

Two-card run, score 16384:

http://www.3dmark.com/fs/5570488

Compare to a stock i7 4790K system with two GTX 980s in SLI, score 17805:

http://www.3dmark.com/fs/4937464

And an i7 6700K OCed to 4.6 GHz with two GTX 970s, where the GPUs are clocked lower than mine, but not by much (1288 MHz compared to my cards' 1312 MHz). It scores 17706:

http://www.3dmark.com/fs/5926519

Even the 2015 game Evolve lists the old i7 920 as a recommended CPU. I mean, a nearly 7-year-old CPU as a recommended CPU.

http://www.geforce.com/games-applications/pc-games/evolve/system-requirements

So what do you think about this?
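[Editor's note] The Fire Strike scores quoted above work out to single-digit percentage gaps. A quick sketch of the arithmetic, using only the numbers from this post:

```python
# Percentage comparison of the 3DMark Fire Strike scores quoted in the post,
# relative to the overclocked i7 920 SLI rig.
scores = {
    "i7 920 @ 4.3 GHz, 2x GTX 970 SLI": 16384,
    "i7 4790K stock, 2x GTX 980 SLI": 17805,
    "i7 6700K @ 4.6 GHz, 2x GTX 970 SLI": 17706,
}

baseline = scores["i7 920 @ 4.3 GHz, 2x GTX 970 SLI"]
for system, score in scores.items():
    delta = 100.0 * (score - baseline) / baseline
    print(f"{system}: {score} ({delta:+.1f}% vs. the OCed i7 920)")
```

By these figures, the overclocked 6700K rig with comparable GPUs lands only about 8% ahead of a nearly seven-year-old, overclocked i7 920.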


----------



## dorsetknob (Oct 23, 2015)

Tomgang said:


> I have been thinking about getting an i7 970/980 or a 6-core Xeon and OCing the crap out of it to give the old X58 platform a longer lifespan.



Review the Xeon Owners Club thread for more info, here:
http://www.techpowerup.com/forums/threads/xeon-owners-club.211143/
Overclocking can be impressive.


----------



## Tomgang (Oct 23, 2015)

dorsetknob said:


> Review the Xeon Owners Club thread for more info, here:
> http://www.techpowerup.com/forums/threads/xeon-owners-club.211143/
> Overclocking can be impressive.



Oh my, these old Xeons seem to be nice overclocking beasts, with 6 cores and clocks around 4 to 4.5 GHz OCed. This looks like a cheap and great way to get more power without spending a lot of money.
Very tempting, I must say. With a Xeon and some OC, it looks like the old X58 platform can still be a nice PC to work with and play games on.


----------



## sneekypeet (Oct 23, 2015)

We are drifting off topic again, so I think it is about time we close up shop!


----------

