# Skylake iGPU Gets Performance Leap, Incremental Upgrade for CPU Performance



## btarunr (Jul 24, 2015)

With its 6th generation Core "Skylake" processors, Intel is throwing everything it's got into increasing the performance of the integrated graphics. This is necessitated not by some newfound urge to compete with entry-level discrete GPUs from NVIDIA or AMD, but by a rather sudden increase in display resolutions after nearly a decade of stagnation. Notebook and tablet designers want to cram in higher-resolution displays, such as WQHD (2560 x 1440), 4K (3840 x 2160), and beyond, and are finding it impossible to drive them without discrete graphics. This is what Intel is likely after. A side effect of this effort is that the iGPU will finally be capable of playing some games at 720p or 900p, with moderate eye-candy. Games such as League of Legends should be fairly playable, even at 1080p. Intel claims that its 9th generation integrated graphics will offer over a 50% performance increase over the previous generation.

Moving on to the CPU, the performance increase is a predictable 10-20% in single/multi-threaded performance over "Broadwell." This is roughly similar to how "Haswell" bettered "Ivy Bridge," and how "Sandy Bridge" bettered "Lynnfield." On some of its "Skylake-U" ultraportable variants, Intel will provide native platform support for much of the modern I/O used by today's tablets and notebooks, which previously required third-party controllers, and which competing semi-custom SoCs natively offer, such as eMMC 5.0, SDIO 3.0, SATA 6 Gb/s, PCIe gen 3.0, and USB 3.0. Communications are also improved, with 2x2 802.11ac, Bluetooth 4.1, and WiDi 6.0.



 

 



*View at TechPowerUp Main Site*


----------



## ZoneDymo (Jul 24, 2015)

I kinda feel the first sentence should have been typed out like:
"With its 6th generation Core "Skylake" processors, Intel is throwing in everything it's got!......... into increasing performance of the integrated graphics.........."


----------



## RejZoR (Jul 24, 2015)

The 6700K CPU should be a hexacore. It's 2015 and they still consider a quad-core to be "enthusiast" level. C'mon, really!? I see ZERO point in switching, and I have a Core i7 920. The only things I'd realistically gain are power consumption and some new instructions. Do the math on how long I can keep using my existing i7 920 before the price difference is justified by electricity bills...

From what I've checked, everything is identical. Cache sizes, core count, thread count, etc. Hell, I even have triple channel on my ancient grunt, and Skylake is only dual channel. Like, ugh!? Totally pointless product. It only makes sense if you don't have a computer and you're buying from scratch. Or you have some shitty dual core from 10 years ago...


----------



## Sakurai (Jul 24, 2015)

Is the integrated GPU comparable to any of NV's low-end chips?


----------



## RejZoR (Jul 24, 2015)

If they say games like League of Legends are "fairly" playable at 1080p, that probably means they are comparing it to the lowest possible end of discrete graphics cards. At the release conference for the Radeon 300 series, AMD talked about LoL while holding their lowest-end R7 cards in their hands, so do the math. And I think they were talking about 60 fps, IIRC. I'm not so sure Intel is capable of 60 fps. Not to mention quality levels. Intel has always had shitty controls for FSAA and AF. Not that it can run many things with those enabled, but still, for older games it is useful...


----------



## jax (Jul 24, 2015)

The upgrade path from socket 1366 goes to socket 2011, extreme edition CPUs. The i7-6700K is considered mainstream, not enthusiast.


----------



## HumanSmoke (Jul 24, 2015)

Sakurai said:


> Is the integrated GPU comparable to any of NV's low-end chips?


The current (Broadwell) Iris Pro 6200 is at the same level as the GT 740...so technically yes, but barely.


----------



## techy1 (Jul 24, 2015)

And again - no one cares about iGPU progress. I know that many people use only the iGPU - BUT they are usually so uninformed (you can insert other synonyms for being dumb) that they would not spot a difference between HD 2000 and Iris Pro anyway (both can run movies in full HD, both can run Angry Birds, browse Pinterest, or run office apps)... So why bother giving them a +30% better iGPU every generation and neglect CPU evolution??? Have you ever heard or read something like "omg, these new Intel CPUs have a sooo much better iGPU - I need an upgrade now... Let's go to Starbucks afterwards"?


----------



## Space Lynx (Jul 24, 2015)

Why is Intel partnered with McAfee?

I see in the first image there is something built into the chip called McAfee YAP...  wth is this?  McAfee is a terrible company and software... looks like I am hanging on to my SB until 10nm is actually on shelves...


----------



## john_ (Jul 24, 2015)

lynx29 said:


> Why is Intel partnered with McAfee?
> 
> I see in the first image there is something built into the chip called McAfee YAP...  wth is this?  McAfee is a terrible company and software... looks like I am hanging on to my SB until 10nm is actually on shelves...



Maybe because Intel owns McAfee? Just a wild guess


----------



## SonicZap (Jul 24, 2015)

Am I the only one who is actually interested in these iGPU improvements? I mean, I'm currently satisfied with a Radeon HD 7850. With these +50% performance increases each generation, Intel is going to catch and exceed the performance of current mid-range graphics cards within a few years. Then I could just buy a Core i7 and get great CPU performance and good-enough graphics performance, instead of buying a Core i3/i5 and a €200 discrete GPU to go with it. TBH I've been waiting for AMD to do this "gaming APU", but it's looking like Intel has now beat AMD in integrated graphics with Skylake.

Of course I might still buy that discrete GPU unless Intel starts improving their drivers..


----------



## Space Lynx (Jul 24, 2015)

john_ said:


> Maybe because Intel owns McAfee? Just a wild guess



Hmm, hope AMD's next cpu can at least compete in games with DX12 active, rather not have anything McAfee in my PC thanks.  terrible antivirus company


----------



## john_ (Jul 24, 2015)

SonicZap said:


> Intel has now beat AMD in integrated graphics with Skylake


 That Skylake doesn't cost $140 like an A10 78X0K. It costs as much as 3 or 4 quad-core APUs. Also, AMD didn't have the money to improve Kaveri. If AMD had better finances we could already have had a Carrizo with DDR4 support on an FM3 socket or something, probably with a more advanced GCN architecture than the one in Fury.
Anyway, what you dream of is what Nvidia fears; that's why they keep presenting more and more high-end cards instead of mid-range. What you dream of is what we will have in 1-2 years from now with 16nm and HBM.


----------



## Ubersonic (Jul 24, 2015)

RejZoR said:


> The 6700K CPU should be a hexacore. It's 2015 and they still consider a quad-core to be "enthusiast" level. C'mon, really!? I see ZERO point in switching, and I have a Core i7 920. The only things I'd realistically gain are power consumption and some new instructions.



From an i7 920 to an i7 6700K you're looking at a ~67% performance increase clock for clock; that's hardly ZERO point.




RejZoR said:


> From what I've checked, everything is identical. Cache sizes, core count, thread count, etc. Hell, I even have triple channel on my ancient grunt, and Skylake is only dual channel. Like, ugh!? Totally pointless product.



Dual-channel DDR4 is roughly equal in performance to triple/quad-channel DDR3 (depending on the speed of the DDR4) due to the higher bandwidth per stick.
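To put rough numbers on that (a back-of-envelope sketch; the DDR3-1600 and DDR4-2666 speeds are illustrative picks, not figures from this thread):

```python
# Theoretical peak bandwidth: transfer rate (MT/s) x 8 bytes per
# transfer (64-bit channel) x number of channels.
def peak_bandwidth_gbs(transfer_rate_mts: int, channels: int) -> float:
    """Peak memory bandwidth in GB/s for the given channel count."""
    bytes_per_transfer = 8  # each channel is 64 bits wide
    return transfer_rate_mts * 1e6 * bytes_per_transfer * channels / 1e9

# Triple-channel DDR3-1600, a typical X58 (i7 920) setup
ddr3_triple = peak_bandwidth_gbs(1600, 3)  # 38.4 GB/s
# Dual-channel DDR4-2666, a plausible overclocked Skylake setup
ddr4_dual = peak_bandwidth_gbs(2666, 2)    # ~42.7 GB/s

print(f"DDR3-1600 x3: {ddr3_triple:.1f} GB/s")
print(f"DDR4-2666 x2: {ddr4_dual:.1f} GB/s")
```

At stock DDR4-2133 the dual-channel figure drops to about 34 GB/s, which is why the "depending on speed" caveat matters.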


----------



## RejZoR (Jul 24, 2015)

Show me the difference in games between an i7 920 at 4.2 GHz and that Skylake. It'll probably be identical. Paying a premium for 3 seconds less in 7-Zip compression? I couldn't care less...


----------



## john_ (Jul 24, 2015)

I think there is one more reason why Intel wants to improve its iGPU: DirectX 12 and asynchronous multi-GPU support.

Today we have DirectX 11, and the iGPU is not used when gaming with a discrete GPU (dual graphics is not important). So with DirectX 11, Intel CPUs win easily against AMD APUs. The iGPU performance doesn't count here.

Tomorrow we will be playing DirectX 12 games. AMD will hopefully have a better architecture with Zen, but even without that, AMD offers more cores at the same prices. With the multithreaded performance that DirectX 12 offers, the difference between using a 4-core APU and a 2-core + HT i3 will be much smaller, if any. With asynchronous multi-GPU, the iGPU part of the APU will give much more help to the discrete GPU, probably making the APU + discrete GPU combination perform much better than the i3 + discrete GPU combination.


----------



## SonicZap (Jul 24, 2015)

john_ said:


> That Skylake doesn't cost $140 like an A10 78X0K. It costs as much as 3 or 4 quad-core APUs.


Yes, but if I'm looking for good performance and longevity, I would still rather pay €350 for a Skylake i7 than €140 for an A10 7870K. The difference in CPU performance is massive, and GPU-wise Skylake is also going to be dozens of percent better.

Also, at this pace, in 2017 Intel's iGPU will be 200% faster than AMD's, not only a few dozen percent. I'm not upgrading from my Phenom II and HD 7850 until 2016-2017 anyway.
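For the curious, a quick sketch of how that "+50% per generation" figure compounds (illustrative arithmetic only, not benchmark data):

```python
# If every generation really brings a 50% iGPU uplift, relative
# performance after n generations is (1 + 0.5) ** n.
def relative_perf(gain_per_gen: float, generations: int) -> float:
    """Cumulative performance multiplier after repeated uplifts."""
    return (1 + gain_per_gen) ** generations

for n in range(1, 4):
    print(f"after {n} generation(s): {relative_perf(0.5, n):.2f}x")
# Two +50% jumps compound to 2.25x (125% faster); "200% faster"
# would take roughly three such generations (3.38x).
```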


----------



## john_ (Jul 24, 2015)

SonicZap said:


> Yes, but if I'm looking for good performance and longevity, I would still rather pay €350 for a Skylake i7 than €140 for an A10 7870K. The difference in CPU performance is massive, and GPU-wise Skylake is also going to be dozens of percent better.


You do get a much faster CPU. In fact, that's what you're paying for: the much faster CPU. And when you're willing to pay almost 3 times the price, that's probably what interests you - the CPU, not the iGPU. Going from the 955 to a Skylake and keeping your 7850 is probably what you'll do, and then wait for Cannonlake, or Zen and Greenland. You just have to wait a little longer.


----------



## Sakurai (Jul 24, 2015)

HumanSmoke said:


> The current (Broadwell) Iris Pro 6200 is at the same level as the GT 740...so technically yes, but barely.



Then, if the next Iris embedded in the 6600K is truly 50% faster than the current Iris 6200, you would technically own a CPU die with a GPU comparable to a $200 GTX 960. And the 6600K MSRP is just north of $250.


----------



## john_ (Jul 24, 2015)

Nope, just about the R7 360 speed. But then, there are also the drivers. If people complain about AMD drivers, what is the case with Intel drivers?


----------



## rooivalk (Jul 24, 2015)

techy1 said:


> And again - no one cares about iGPU progress. I know that many people use only the iGPU - BUT they are usually so uninformed (you can insert other synonyms for being dumb) that they would not spot a difference between HD 2000 and Iris Pro anyway (both can run movies in full HD, both can run Angry Birds, browse Pinterest, or run office apps)... So why bother giving them a +30% better iGPU every generation and neglect CPU evolution??? Have you ever heard or read something like "omg, these new Intel CPUs have a sooo much better iGPU - I need an upgrade now... Let's go to Starbucks afterwards"?


It may not apply to you, but it applies to millions of others.

My friend with an old Athlon X2 (AMD 690 chipset?) can't play Dota 2, for example. Much of the Dota, LoL, and CS playerbase (basically anything very popular) is still running basic PCs/notebooks. Even outside gaming, browsers and Windows itself are GPU-accelerated. A good (i)GPU is a basic necessity now.


----------



## Sakurai (Jul 24, 2015)

I don't really know, since I currently own a 3570K and the HD Graphics drivers have been stable so far. But since Skylake-S (the desktop parts) incorporates full hardware-level DX12 support, the iGPU will pretty surely receive a massive boost as well. The CPU gains might not be much, but considering the Multiadapter feature, these Skylake chips should have tremendous value.


----------



## GreiverBlade (Jul 24, 2015)

In a top-end CPU I couldn't care less about an IGP... or any improvement in that domain, even for HTPC: why get a $350-and-up CPU when a $140 APU could do basically the same? Is Intel trying to stay alone in the PC market and rule out Nvidia and AMD?

"They should focus the R&D on something else... or buy Nvidia... I don't get the Intel IGP improvement madness... oh yes... the future is the NUC (or so they would like it to be); we're all gonna have a tiny box with an Intel CPU and IGP, so they'd better improve their IGP for when they take over the world (I am totally not serious there...)"

(Of course, in the laptop domain it's different... though, is a 2-core + HT i7 a top-end CPU? Well, in a laptop it is... currently my i5-5200U + HD Graphics 5500 is enough... no gaming notebook though, and surely way more affordable than a future Skylake laptop.)


----------



## RazrLeaf (Jul 24, 2015)

Why are so many people this mad at Intel? They're a business that exists to make returns on investments. I'm surprised that they're willing to have such consistent product improvement when they really have no competition in the high end CPU market. Sure, I'd be happier if they gave us +30% CPU performance, but they have a strong lead, and would like to secure their future since they're not fighting for the present. But Intel sees that they're weak in their iGPU side of the business, and that's why their focus is there. It's all about making money at the end of the day, for any business that has to answer to investors.

TL;DR Start rooting for AMD if you want product improvements from Intel that aren't "pointless."


----------



## SonicZap (Jul 24, 2015)

RazrLeaf said:


> Why are so many people this mad at Intel?


A big part of that is because Intel partially reached their current position with anticompetitive practices. AMD _could_ (no way to know for sure, since they've mostly screwed themselves with bad decisions one after the other) be in a better shape today if they had gained the strong lead they deserved with the Athlon 64s, but Intel prevented that by forcing OEMs to delay AMD product launches.


----------



## 64K (Jul 24, 2015)

I will wait for reviews to decide but I'm really wanting to build a new rig. Depending on Skylake vs Ivy Bridge (my present CPU) I may wait for Cannonlake.



john_ said:


> I think there is one more reason why Intel wants to improve its iGPU. DirectX 12 and asynchronous multi GPU support.
> 
> Today we have DirectX 11, and the iGPU is not used when gaming with a discrete GPU (dual graphics is not important). So with DirectX 11, Intel CPUs win easily against AMD APUs. The iGPU performance doesn't count here.
> 
> Tomorrow we will be playing DirectX 12 games. AMD will hopefully have a better architecture with Zen, but even without that, *AMD offers more cores at the same prices*. With the multithreaded performance that DirectX 12 offers, the difference between using a 4-core APU and a 2-core + HT i3 will be much smaller, if any. With asynchronous multi-GPU, the iGPU part of the APU will give much more help to the discrete GPU, probably making the APU + discrete GPU combination perform much better than the i3 + discrete GPU combination.



AMD is also losing hundreds and hundreds of millions of dollars yearly doing things like that, and they are now on the verge of bankruptcy. I guess, having owned my own business, I think differently than most, but one of the things that AMD fans love about AMD (cheap prices) is one of the things that has brought AMD to ruin over the years. When I owned my business I charged what everyone else was charging, and usually my profit was high, occasionally ridiculously high, but I was fine with that. If they didn't do business with me, they would pay the same elsewhere. I provided the best service that I could and beat out my competitors pretty well.


----------



## john_ (Jul 24, 2015)

RazrLeaf said:


> TL;DR Start rooting for AMD if you want product improvements from Intel that aren't "pointless."


Maybe you should start shouting at Intel to cut the bullying of OEMs and the contra-revenue program, to give AMD a chance to breathe. Then maybe AMD will also be in a position to finance R&D and improve its products. After that, you might want to turn to Nvidia and ask them to stop their practices with proprietary stuff like GameWorks and PhysX. If they accept your complaints, then maybe AMD will become competitive, and you will be able to support competition in the PC market by buying... more Intel and Nvidia hardware.


----------



## john_ (Jul 24, 2015)

64K said:


> AMD is also losing hundreds and hundreds of millions of dollars yearly doing things like that, and they are now on the verge of bankruptcy. I guess, having owned my own business, I think differently than most, but one of the things that AMD fans love about AMD (cheap prices) is one of the things that has brought AMD to ruin over the years. When I owned my business I charged what everyone else was charging, and usually my profit was high, occasionally ridiculously high, but I was fine with that. If they didn't do business with me, they would pay the same elsewhere. I provided the best service that I could and beat out my competitors pretty well.


 Cheap prices were a necessity for AMD most of the time. Even when they had faster processors, Intel was controlling the OEMs. Every OEM and retail store was selling Pentium 4s. After that, the Phenom processor wasn't that fast, and the Bulldozer architecture was a pure disaster. So how can you go out and charge equally or more? The competition controls the OEMs, the retail stores, the press. People are used to blaming AMD for the same things they will find plenty of excuses for when Nvidia or Intel does them. When everything is against you, can you really expect to empty your warehouses with prices that are not ultra-competitive? Fury X came out at the same price as the 980 Ti, and guess what: everyone was looking at the second decimal on the fps counter to say that the card was a failure. Suddenly a pump noise was compared to a jet engine, and tech sites rushed to write articles about how AMD failed there. And they rushed because it was already known that the problem was fixed. When everyone is pointing a gun at you, can you really charge extra?


----------



## Easo (Jul 24, 2015)

Prima.Vera said:


> WiDi 6.0 ??



Wireless Display.


----------



## jabbadap (Jul 24, 2015)

lynx29 said:


> Hmm, hope AMD's next cpu can at least compete in games with DX12 active, rather not have anything McAfee in my PC thanks.  terrible antivirus company



Heh, speaking of it:




----------



## buggalugs (Jul 24, 2015)

RejZoR said:


> The 6700K CPU should be a hexacore. It's 2015 and they still consider a quad-core to be "enthusiast" level. C'mon, really!? I see ZERO point in switching, and I have a Core i7 920. The only things I'd realistically gain are power consumption and some new instructions. Do the math on how long I can keep using my existing i7 920 before the price difference is justified by electricity bills...
> 
> From what I've checked, everything is identical. Cache sizes, core count, thread count, etc. Hell, I even have triple channel on my ancient grunt, and Skylake is only dual channel. Like, ugh!? Totally pointless product. It only makes sense if you don't have a computer and you're buying from scratch. Or you have some shitty dual core from 10 years ago...



 Dude, you're ancient. Skylake, or even Haswell, has 100% better single-threaded performance than a 920, and more than 100% better multi-threaded performance. You're looking at at least 50% faster in games.

http://www.anandtech.com/show/7003/the-haswell-review-intel-core-i74770k-i54560k-tested/7

 Add to that the new instructions - the IPC improvements are huge compared to a 920 - plus all the new features on modern boards: USB 3, PCI-E 3, WiFi, Bluetooth, better audio and networking, etc. Everything is much faster - moving data, rendering video, running scans, boot times - everything is noticeably faster... and it uses half as much power.

 You can't compare cores and cache and say it's the same CPU...

 Sure, if you only surf the web and play games from 2002, or GPU-limited games, you might not notice much, but I had a 920 and it's a dinosaur compared to Haswell or Skylake.

 ...and this platform isn't enthusiast, it's mainstream. X58/920 was enthusiast.


----------



## Naito (Jul 24, 2015)

buggalugs said:


> ...and this platform isn't enthusiast, it's mainstream. X58/920 was enthusiast.



This. Besides, it'd be worth the upgrade just for the native chipset features. Most peeps I know ditched their Nehalems years ago for Sandy or Ivy Bridges.


----------



## Ferrum Master (Jul 24, 2015)

An update to a thing I do not use... an iGPU...

SCREW it... Sandy Bridge FTW...


----------



## NeDix! (Jul 24, 2015)

RejZoR said:


> Show me the difference in games between an i7 920 at 4.2 GHz and that Skylake. It'll probably be identical. Paying a premium for 3 seconds less in 7-Zip compression? I couldn't care less...



You really know how to buy, dude. There's no point in moving to Skylake for any i7 user; there is a point in moving to a 5820K, but 2011-v3 is so expensive D:. But in the future it may be worth getting 2011-v3 now, if you can get an 8/16 chip used cheap later :v


----------



## MxPhenom 216 (Jul 24, 2015)

RejZoR said:


> The 6700K CPU should be a hexacore. It's 2015 and they still consider a quad-core to be "enthusiast" level. C'mon, really!? I see ZERO point in switching, and I have a Core i7 920. The only things I'd realistically gain are power consumption and some new instructions. Do the math on how long I can keep using my existing i7 920 before the price difference is justified by electricity bills...
> 
> From what I've checked, everything is identical. Cache sizes, core count, thread count, etc. Hell, I even have triple channel on my ancient grunt, and Skylake is only dual channel. Like, ugh!? Totally pointless product. It only makes sense if you don't have a computer and you're buying from scratch. Or you have some shitty dual core from 10 years ago...



LGA 1151 is still the mainstream platform.


----------



## GhostRyder (Jul 24, 2015)

The iGPU does have a place, though - mostly in the mobile market, where add-in GPUs cost quite a lot, so it's good for that area. But on the desktop it's a minimal thing, especially on some of these expensive chips...

I will be holding on to my 5930K for at least 3+ years.


----------



## Petey Plane (Jul 24, 2015)

techy1 said:


> And again - no one cares about iGPU progress. I know that many people use only the iGPU - BUT they are usually so uninformed (you can insert other synonyms for being dumb) that they would not spot a difference between HD 2000 and Iris Pro anyway (both can run movies in full HD, both can run Angry Birds, browse Pinterest, or run office apps)... So why bother giving them a +30% better iGPU every generation and neglect CPU evolution??? Have you ever heard or read something like "omg, these new Intel CPUs have a sooo much better iGPU - I need an upgrade now... Let's go to Starbucks afterwards"?



On every annual Steam hardware survey, Intel iGPU users make up a large share of the Steam user base - something like a consistent 60% range (edit: actually about 20%, but I think the point still stands as to who Intel's primary market is, and it's not enthusiasts) - and those are just the users that actually game. The vast majority of PC users are on laptops and don't play games. Those customers are much more important to Intel than gamers, and make up significantly more of their customer base. Pushing web content at 4K can be very taxing, so having a strong iGPU is important. CPUs are already overpowered for those tasks, so there is little incentive for Intel to spend resources there. The iGPU is much more important to Intel's primary customers, the OEMs.

It's easy to be myopic as an enthusiast, but the reality is, enthusiasts are a pretty insignificant and not particularly profitable market. If Intel ever had to pick between selling to Newegg or selling to Lenovo, I think the choice would be pretty easy for them.


----------



## Ferrum Master (Jul 24, 2015)

Petey Plane said:


> On every annual Steam hardware survey, Intel iGPU users are the majority of the Steam user base, something like consistent 60% range, and those are just the users that actually game.
> It's easy to be myopic, as an enthusiast, but the reality is, enthusiasts are a pretty insignificant, and not particularly profitable market.  If Intel ever had to pick between selling to Newegg, or selling to Lenovo, i think the choice would be pretty easy for them.



I can also say those are the users that don't pay! Look at your facts from a realistic point of view. So around 4 million peeps actually use the iGPU? Are you kidding? Out of the whole CPU batch?


----------



## Phobia9651 (Jul 24, 2015)

Petey Plane said:


> On every annual Steam hardware survey, Intel iGPU users are the majority of the Steam user base, something like consistent 60% range, and those are just the users that actually game.



I reckon that is just people that have an Intel iGPU in their system (basically all the SNB, IB, Haswell and Broadwell users). No one I know games on an iGPU.

Edit: A quick look at the actual data tells you that: Nvidia 52%, AMD 27.72%, Intel 19.88% and Other 0.4%.
So I'm not sure where you got your "consistent 60%" from. Besides, now that I see the actual data, I assume that these are mostly laptop users that use an Intel iGPU.


----------



## Petey Plane (Jul 24, 2015)

Ferrum Master said:


> I can also say those are the users that don't pay! Look at your facts from a realistic point of view. So around 4 million peeps actually use the iGPU? Are you kidding? Out of the whole CPU batch?





urza26 said:


> I reckon that is just people that have an Intel iGPU in their system (basically all the SNB, IB, Haswell and Broadwell users). No one I know games on an iGPU.



Ok, I've been wrong. The current hardware survey shows Intel at about 20% of the Steam GPU user base. So, fair enough, it's still a large percentage. And nearly 50% of users only have a 2-core CPU. All that being said, we should still remember that OEM laptop manufacturers like HP, Apple and Lenovo are Intel's biggest market, and that's why Intel focuses so much on iGPU performance.


----------



## Ferrum Master (Jul 24, 2015)

Petey Plane said:


> Ok, I've been wrong. The current hardware survey shows Intel at about 20% of the Steam GPU user base. So, fair enough, it's still a large percentage. And nearly 50% of users only have a 2-core CPU. All that being said, we should still remember that OEM laptop manufacturers like HP, Apple and Lenovo are Intel's biggest market, and that's why Intel focuses so much on iGPU performance.



Mate... the user count behind that percentage is less than the number of GTA5 PC copies sold... and GTA5 still doesn't run on an iGPU...


----------



## Petey Plane (Jul 24, 2015)

Ferrum Master said:


> Mate... the user count behind that percentage is less than the number of GTA5 PC copies sold... and GTA5 still doesn't run on an iGPU...



? I'm not sure I understand your comment. That number is from the Steam Hardware & Software Survey, Jan. 2014 to June 2015. Not sure what that has to do with the number of GTA5 copies sold. The point still stands, I think: AMD only has about 8% more of the user base than Intel.

The survey does have a selection bias, in that if you are buying a discrete GPU, then you are also very likely to be using Steam, and vice versa. But that also means that non-Steam users are much more likely to be using iGPUs.


----------



## Frick (Jul 24, 2015)

I still want a Celeron/Pentium/i3 cpu with Iris/Iris Pro in it.


----------



## Ferrum Master (Jul 24, 2015)

Petey Plane said:


> ? I'm not sure I understand your comment. That number is from the Steam Hardware & Software Survey, Jan. 2014 to June 2015. Not sure what that has to do with the number of GTA5 copies sold. The point still stands, I think: AMD only has about 8% more of the user base than Intel.
> 
> The survey does have a selection bias, in that if you are buying a discrete GPU, then you are also very likely to be using Steam, and vice versa. But that also means that non-Steam users are much more likely to be using iGPUs.



You still don't get that the iGPU gamer count is like a drop in the bucket compared to Intel's CPU volumes, and so are the numbers you showed using Steam... The GTA5 data shows that a single title accounts for more dedicated-GPU owners than the whole iGPU user base on Steam.

Don't mix AMD up in this... and exactly: non-Steam users, those who use their machines for business... the real other batch of CPUs whose owners don't really care about Iris. The thing is useless to begin with... such an amount of transistors... there are over 100 million desktop shipments for Intel; their presence numbers cannot be compared apples to apples to AMD's presence... The vast majority of users don't care about the iGPU as long as they have something that outputs an image to a monitor and shows kitten videos.


----------



## RazrLeaf (Jul 24, 2015)

SonicZap said:


> A big part of that is because Intel partially reached their current position with anticompetitive practices. AMD _could_ (no way to know for sure, since they've mostly screwed themselves with bad decisions one after the other) be in a better shape today if they had gained the strong lead they deserved with the Athlon 64s, but Intel prevented that by forcing OEMs to delay AMD product launches.


I'd honestly forgotten about that (the incident occurred 12 years ago, the fine was levied 6 years ago).



john_ said:


> Maybe you should start shouting at Intel to cut the bullying of OEMs and the contra-revenue program, to give AMD a chance to breathe. Then maybe AMD will also be in a position to finance R&D and improve its products. After that, you might want to turn to Nvidia and ask them to stop their practices with proprietary stuff like GameWorks and PhysX. If they accept your complaints, then maybe AMD will become competitive, and you will be able to support competition in the PC market by buying... more Intel and Nvidia hardware.


And I don't want to shout at Intel. Or Nvidia. If they develop something, they are perfectly within their rights to keep it proprietary. They're for-profit companies after all. Though it'll be interesting to see what happens if AMD actually goes the way of the dodo...


----------



## tabascosauz (Jul 24, 2015)

Bulldozer was 100% AMD's problem, and 100% AMD's downfall. Between the Pentium 4-esque pipeline problem, the supremely slow cache, and the attempt to stay 'upgradable' by not moving to an FCH, AMD still tried to distort the facts and make it sound as if Bulldozer was better than SB because of its 8 'cores'. When you are fighting an uphill battle with the X6 1100T being all you have to offer, something like Bulldozer is probably going to hurt you more than staying conventional would. Intel had a backup plan for Prescott in the Pentium III lineage that eventually ended up as Yonah and Core 2. AMD had nothing.

You could say that AMD had no resources to devote to making a backup plan because Intel had bribed the OEMs those years ago. Whatever the excuse might be, the burden of Bulldozer rested squarely on an already weak AMD's shoulders, while Intel had a nice surprise with Sandy Bridge supremacy.

In the corporate world, exactly how much space is devoted to conscience and morality? When Intel was fumbling with the monstrosity that was Prescott, what did you expect them to do? Pull an AMD and place all their hopes on Prescott?


----------



## john_ (Jul 24, 2015)

RazrLeaf said:


> Though it'll be interesting to see what happens if AMD actually goes the way of the dodo...


Well, that's what people said in Greece a few months ago: "We've already seen the old political parties. Let's try something new. Let's vote for a left government. What could possibly go wrong? It can't get worse than it already is!"


*tabascosauz*

They could have tried to shrink Thuban to 32nm and later add the good stuff from Bulldozer to that chip. I think Intel is doing exactly that with Broadwell. I think they made Broadwell in case Skylake had problems of any kind.


----------



## zithe (Jul 24, 2015)

More iGPU speed might be helpful for users of the Cintiq Companion or similar level tablets for painting.


----------



## R-T-B (Jul 24, 2015)

buggalugs said:


> Dude, you're ancient, Skylake, or even haswell has 100% better single threaded performance than a 920, and more than 100% better multi threaded performance.  You're looking at least 50% faster in games.



Uh, it's actually about 20% in IPC.  Many Nehalems will clock past a Haswell as well.  Put two and two together.
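The "put two and two together" math can be sketched as a quick back-of-the-envelope product of IPC and clock. The ~20% IPC figure is the one claimed in this thread; the clock speeds below are illustrative assumptions, not measurements:

```python
# Back-of-the-envelope single-thread comparison: throughput ~ IPC x clock.
# The 20% IPC uplift and the clocks below are illustrative assumptions only.
nehalem_ipc = 1.0          # Nehalem as the baseline
haswell_ipc = 1.2          # ~20% IPC gain claimed in the thread

nehalem_oc_ghz = 4.2       # a well-overclocked i7 920
haswell_stock_ghz = 3.5    # an illustrative Haswell clock

nehalem_perf = nehalem_ipc * nehalem_oc_ghz      # relative units
haswell_perf = haswell_ipc * haswell_stock_ghz   # relative units

# Both land around 4.2, i.e. roughly a wash in single-thread terms.
print(f"Overclocked Nehalem: {nehalem_perf:.2f}")
print(f"Stock-ish Haswell:   {haswell_perf:.2f}")
```

Under those assumed clocks the two come out nearly identical, which is the point being made: IPC alone doesn't settle it.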


----------



## Petey Plane (Jul 24, 2015)

Ferrum Master said:


> The whole majority of the users don't care about the iGPU as long they have something that outputs and image to monitor and shows kitten videos.



OK, I get what you're saying: the number of users who buy Intel specifically for the iGPU is tiny, which is true.  The issue is that the iGPU does matter when 1440p and 4K become the standard laptop resolutions, and watching 4K kitten videos (along with Netflix, Hulu, etc.) will require a powerful iGPU.  A lot of other web content is GPU-dependent as well.  And someone has to make those kitten videos, so editing and transcoding are sped up significantly if you have a more powerful iGPU.  There is also plenty of business presentation software that benefits from a strong GPU.

The other reality is, going from a 2600K to a 6700K, both using the same discrete video card, there would only be about a 10% increase in FPS in most games at 1440p or 4K.  But if I can get that 10% increase with a 30% decrease in power usage, then that sounds good to me.  A lot of the users here already have overpowered CPUs for what they mainly do: gaming.

And sorry if I keep repeating myself, but gamers and enthusiasts are pretty close to the bottom of Intel's priority list when it comes to product development.  They throw us bones with the K and X CPUs, but compare the number of chips that have those suffixes to the number that don't.  The K and X parts aren't the revenue generators for Intel.


----------



## Brother Drake (Jul 24, 2015)

Though I understand why Intel has put so much into onboard graphics, with smartphones, laptops and office workstations making up such a large part of the market, I wish that they would make a line of CPUs for people who use discrete graphics cards as well, OR(!) work with the graphics card makers to create an architecture that shares graphics workload between the CPU and the external graphics card when one is available. That may be too much to ask, but why not make a line of CPUs that are just CPUs? I feel that Intel is kind of ignoring the (very large) gamer market.


----------



## Petey Plane (Jul 24, 2015)

Brother Drake said:


> Though I understand why Intel has put so much into onboard graphics, with smartphones, laptops and office workstations making up such a large part of the market, I wish that they would make a line of CPUs for people who use discrete graphics cards as well, OR(!) work with the graphics card makers to create an architecture that shares graphics workload between the CPU and the external graphics card when one is available. That may be too much to ask, but why not make a line of CPUs that are just CPUs? I feel that Intel is kind of ignoring the (very large) gamer market.



The i7 Extreme X and K chips are for that.  Also, economies of scale are why they don't make 1151-socketed chips without iGPUs, since they all share the same basic architecture.  It doesn't make a lot of sense for Intel to fab a separate chip architecture without an iGPU when it can just as easily build one die, bin the higher-performance chips, and leave it up to the end user to decide whether to use the iGPU or not.  If they were to build a line of 1151 CPUs without iGPUs, they would cost significantly more, since the economies of scale would not benefit the much smaller production line.

As far as sharing resources, that is really a driver issue, and one that Microsoft says will be addressed with DX12.


----------



## RejZoR (Jul 24, 2015)

Ferrum Master said:


> I can also say those are the users that don't pay! Take your facts a bit from a realistic point of view. So around 4 million peeps actually use the iGPU? Are you kidding? From the whole CPU batch?



Question is, how many of those users are actually using discrete graphics cards? You can't claim market share if a GPU is attached to every CPU whether you use it or not.


----------



## Petey Plane (Jul 24, 2015)

RejZoR said:


> Question is, how many of those users are actually using discrete graphics cards? You can't claim market share if a GPU is attached to every CPU whether you use it or not.



The Steam Hardware Survey records your primary display device, so even if you have an iGPU, it will record the discrete GPU that is outputting video.  I think the 20% number is accurate.  I'm sure there are a lot of people in less developed areas of the world playing LoL, DOTA and CS 1.6 at 720p with the settings turned down, easily doable with even an older Intel iGPU.


----------



## tabascosauz (Jul 24, 2015)

Brother Drake said:


> Though I understand why Intel has put so much into onboard graphics, with smartphones, laptops and office workstations making up such a large part of the market, I wish that they would make a line of CPUs for people who use discrete graphics cards as well, OR(!) work with the graphics card makers to create an architecture that shares graphics workload between the CPU and the external graphics card when one is available. That may be too much to ask, but why not make a line of CPUs that are just CPUs? I feel that Intel is kind of ignoring the (very large) gamer market.



The problem with this is that making a line of CPUs without iGPUs requires a lot of resources, because the current 2C/4C dies for Haswell don't just go into one product. And yes, the i3s and Pentiums shouldn't be handicapped 4C dies either; they are reportedly smaller dies with just 2 cores. Very large gamer market? Are they ignoring us gamers? We still use their CPUs, and even the lack of overclockability in Haswell isn't a huge problem for gamers.

And you gotta keep in mind that Intel's High-End Desktop platforms are made just for that, and a large part of the "gamer market" doesn't even game on desktops, let alone gaming desktops or custom-built desktops.



Petey Plane said:


> The i7 Extreme X and K chips are for that.  Also, economies of scale are why they don't make 1151-socketed chips without iGPUs, since they all share the same basic architecture.  It doesn't make a lot of sense for Intel to fab a separate chip architecture without an iGPU when it can just as easily build one die, bin the higher-performance chips, and leave it up to the end user to decide whether to use the iGPU or not.  If they were to build a line of 1151 CPUs without iGPUs, they would cost significantly more, since the economies of scale would not benefit the much smaller production line.
> 
> As far as sharing resources, that is really a driver issue, and one that Microsoft says will be addressed with DX12.



However, it IS viable to have a "gamer" line of CPUs with the iGPU disabled, without having to spend as many resources as a new CPU die would need. Intel has proved this for 3 generations so far with E3-12x0 SKUs that have their iGPUs disabled. Although Intel clearly doesn't think that gaming is very important, because as of E3 v4 the trend has been broken, and the E3s were never marketed for gaming anyway. If Intel offered a similarly handicapped unlocked SKU in the future, it could be interesting, though this would have a separate, potentially detrimental impact on its existing i5-xxxxK and i7-xxxxK SKUs.


----------



## ensabrenoir (Jul 24, 2015)

Enthusiasts... sometimes we just don't get it.  The PC landscape has changed.  Intel knows what it's doing, and they are doing it at an insanely fast rate... themselves. I would have sworn that they would've had to buy someone to do it.


----------



## Petey Plane (Jul 24, 2015)

tabascosauz said:


> The problem with this is that making a line of CPUs without iGPUs requires a lot of resources, because the current 2C/4C dies for Haswell don't just go into one product. And yes, the i3s and Pentiums shouldn't be handicapped 4C dies either; they are reportedly smaller dies with just 2 cores. Very large gamer market? Are they ignoring us gamers? We still use their CPUs, and even the lack of overclockability in Haswell isn't a huge problem for gamers.
> 
> And you gotta keep in mind that Intel's High-End Desktop platforms are made just for that, and a large part of the "gamer market" doesn't even game on desktops, let alone gaming desktops or custom-built desktops.
> 
> ...



I get what you're saying, but knowing modern marketing, they'd probably charge more for the chip with the iGPU disabled.


----------



## Brother Drake (Jul 25, 2015)

If Intel is pushing the graphics architecture, then it stands to reason that some processors will fail testing on that part of the chip. Those processors can have the graphics disabled and be sold as a 'P' series, as Intel did with Sandy Bridge. They (should) cost less and use less power than fully functional processors. I also think Intel needs to remember that the customers who have always driven the advancement of top-end CPUs are gamers and graphics specialists, and that market is the most profitable on a price-per-unit basis.


----------



## alwayssts (Jul 25, 2015)

Ferrum Master said:


> An update to a thing I do not use... an iGPU...
> 
> SCREW it... Sandy Bridge FTW...




While I hear ya on the IGP, and people crapping on enthusiast improvements in Skylake in general...I dunno, man.  Even though people seem to agree it's generally not worth it at this point for discrete gpu gaming and/or general tasks...I really want off of this old boat.  It's not broken, but the wood paneling is starting to fade.

I want M.2/SATA Express capability.  I want to actually be able to RAID my MX100s (remember, many of us Sandy users can't use RAID 0 + TRIM, at least not without a hell of a lot of trouble).  Hell, I want PCIe 3.0.

I also would really like some tangible added performance for video decoding stuff, etc. (50%?).

Many people seem to take the approach that upgrading from a 4c to a 4c will long be a fool's errand, and the only way to go is up (i.e., to a 5820K)... which is fair... but I, for one, think Skylake is going to get there just as well.  It may be through IPC rather than cores, and overclocked vs. overclocked (I think at stock the 5820K will justify its ~10% higher price), but that's okay.

While I'm not quite on board with the NUC future, I am on board with mini-itx/microatx present.  Just like with Sandy Bridge, I could see building a fairly small performance box out of Skylake; perhaps one that can more-or-less rival a similar-price build on x99.  Whenever I think of simply building the latter now, I feel compelled to check the BTU rating of my air conditioner...I just don't think it would work (for me).

IOW, it may be a side-step for us enthusiasts, sure... but it's a side-step that, coupled with all the improvements over the past few years, finally pushes it over the edge, imho.

Do I fault anyone that goes 5820k+?  No.

How about waiting for Zen (which may compete or be slightly better overall, while perhaps offering a better price for an 8-core part)? What about Skylake-E, which will actually probably be the big step forward?

All of those are respectable opinions.

To me, though, I need a new platform, preferably the best combo of longevity and the newest features, with a worthwhile upgrade on the CPU which preferably won't use a ridiculous amount of power overclocked.

I think that's Skylake.


----------



## tabascosauz (Jul 25, 2015)

Plus, HD Graphics has its uses. Quick Sync is a nice thing to have when recording games, and when you need to test the odd thing or two sans graphics card, Intel's graphics will do nicely.

Very few of us are extreme overclockers and the presence of an iGPU is not the end of the world. If Intel sat back and kept HD 2500 performance for 5 generations, whiners would complain about how they're just making money off helpless consumers and not doing anything for them. If Intel is improving iGPUs as it is now, the same whiners complain about how Intel GPUs suck and how the Future is Fusion.


----------



## RejZoR (Jul 25, 2015)

Skylake-E might be interesting. But that part is coming out sometime in 2016, which is still faaaaar away...


----------



## GreiverBlade (Jul 25, 2015)

tabascosauz said:


> Plus, HD Graphics has its uses. Quick Sync is a nice thing to have when recording games, and when you need to test the odd thing or two sans graphics card, Intel's graphics will do nicely.


Shadowplay? DVR? ... I've used both (OK, Shadowplay is better...), but HD Graphics only has its use if it's alone, which is rarely the case in an i5/i7 machine (unless it's a laptop, but you don't game so often on a laptop or on HD Graphics). Well, I would not spit on a beefier HD Graphics, indeed... but for a mobile CPU, or eventually an HTPC, though AMD is still a better option there. I am quite happy about the HD Graphics 5500 vs. the HD Graphics 4400 on my laptop, though.

As for testing things when your main GPU goes MIA... well, I always have a spare GT730/HD5450, but for a mITX or even micro-ATX build it's indeed useful (HTPC again... or a LAN box at most).


----------



## RichF (Jul 25, 2015)

tabascosauz said:


> Bulldozer was 100% AMD's problem, and 100% AMD's downfall. Through the Pentium 4-esque pipeline problem, supremely slow cache and the attempt to stay 'upgradable' by not moving to FCH, AMD still tried to distort the facts and try to make it sound as if Bulldozer was better than SB with its 8 'cores'. When you are fighting an uphill battle with the X6 1100T being all that you have to offer, something like Bulldozer is probably going to hurt you more than staying conventional. Intel had a backup plan for Prescott in the Pentium III that eventually ended up as Yonah and Core 2. AMD had nothing.
> 
> You could say that AMD had no resources to devote to making a backup plan because Intel had bribed the OEMs those years ago. Whatever the excuse might be, the burden of Bulldozer rested squarely on an already weak AMD's shoulders, while Intel had a nice surprise with Sandy Bridge supremacy.
> 
> In the corporate world, exactly how much space is devoted to conscience and morality? When Intel was fumbling with the monstrosity that was Prescott, what did you expect them to do? Pull an AMD and place all their hopes on Prescott?


The FX chips are not that bad. If they were on 22nm they could be clocked even higher. The main problem is that they were released before the rest of the tech was ready for them (Windows thread management, DX12, etc.), and now that the tech has almost caught up, they're still stuck on 32nm.

CMT is a sound strategy, but not when DX11 and Windows are poor at multithreading.


----------



## Caring1 (Jul 25, 2015)

Brother Drake said:


> why not make a line of CPUs that are just CPUs? I feel that Intel is kind of ignoring the (very large) gamer market.


They do, they're called XEON!


----------



## RejZoR (Jul 25, 2015)

Xeons are for workstations, not gaming rigs. Unless you're really loaded in which case sure, go with a Xeon...


----------



## Nordic (Jul 25, 2015)

RejZoR said:


> Question is, how many of those users are actually using discrete graphic cards. You can't claim market share if GPU is attached to every CPU whether you use it or not.


Looks like about 20% use Intel GPUs for gaming.
http://store.steampowered.com/hwsurvey



RejZoR said:


> Xeons are for workstations, not gaming rigs. Unless you're really loaded in which case sure, go with a Xeon...


Actually, Xeons are pretty cheap. You can get the equivalent of an i7 for the price of an i5 with Xeons. That is, if you don't mind not being able to overclock.


----------



## RejZoR (Jul 25, 2015)

I don't think there is any need. With my i7 920 I can feel, in games like Killing Floor 2, that at its stock clock of 2.66 GHz it lacks some grunt. Overclocked to just 3.2 GHz, it already feels like a million times smoother gameplay, even in later waves with tons of enemies. At 4 GHz and 4.2 GHz, it's silky smooth no matter what.

So, theoretically, if Skylake is, let's say, 50% more efficient per MHz and is factory-clocked at 4.2 GHz, it should be fine even at stock speeds for a very long time.
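A rough sketch of that arithmetic, modeling performance as per-clock efficiency times frequency (the 50% and 4.2 GHz numbers are the hypotheticals above, not measured figures):

```python
# Rough "effective grunt" estimate: perf ~ per-clock efficiency x frequency.
# The 50% per-clock figure and 4.2 GHz stock clock are hypotheticals, not data.
i7_920_stock = 1.00 * 2.66    # stock i7 920 as the baseline
i7_920_oc    = 1.00 * 4.20    # overclocked i7 920
skylake_hyp  = 1.50 * 4.20    # hypothetical Skylake: +50% per clock at 4.2 GHz

print(f"i7 920 stock : {i7_920_stock:.2f}")   # 2.66
print(f"i7 920 @ 4.2 : {i7_920_oc:.2f}")      # 4.20
print(f"Skylake est. : {skylake_hyp:.2f}")    # 6.30
```

Under those assumptions the hypothetical Skylake would land ~50% ahead of even a 4.2 GHz i7 920, which is why stock speeds would suffice for a long while.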

I wonder if Skylake-E will feature more cores, and what they will use eDRAM for: the CPU part, or only as an aid for the iGPU? I'm kinda aiming at that for 2016. If it can provide me with all the compute power I need for 6 more years like the i7 920 did, that would be amazing.


----------



## LAN_deRf_HA (Jul 25, 2015)

All the 920 champions have quite the rose-tinted view of the first-gen i7s. They didn't clock well, ran hot and hungry when pushed past 4 GHz, and weren't any faster than Yorkfield for gaming. They were stomped on as soon as Sandy showed up. I don't know of anyone who jumped to Sandy from Nehalem and regretted it. And now it's been replaced 6 times over, each time with a performance boost, and you people are still bemoaning the lack of advancement? Get over it. These mental gymnastics used to justify keeping a CPU from 2008 are absurd.


----------



## RejZoR (Jul 25, 2015)

They don't clock well?! From 2.66 GHz to 4.2 GHz is "not clocking well"!? Your mental gymnastics are broken...

If I gave you one system with an overclocked i7 920 and one with Skylake, you wouldn't be able to tell the difference. I don't give a shit about synthetic benchmarks and the few cases where you actually notice it. I'm often encoding videos and compressing data with 7-Zip, and this thing hammers workloads like a champ with 8 threads. And with games, maybe you gain 2 fps using the latest CPU. Totally worth replacing the entire platform and spending 1k € on it... by your logic.


----------



## hat (Jul 26, 2015)

Have to agree with RejZoR. 2.66 to 4 is not clocking well? That's pretty damn good if you ask me. Haswell clocks like shit compared to the original i7. I'm sure the lower-end chips could clock well, IF we were allowed...


----------



## Thefumigator (Jul 26, 2015)

RejZoR said:


> They don't clock well?! From 2.66 GHz to 4.2 GHz is "not clocking well"!? Your mental gymnastics are broken...
> 
> If I gave you one system with an overclocked i7 920 and one with Skylake, you wouldn't be able to tell the difference. I don't give a shit about synthetic benchmarks and the few cases where you actually notice it. I'm often encoding videos and compressing data with 7-Zip, and this thing hammers workloads like a champ with 8 threads. And with games, maybe you gain 2 fps using the latest CPU. Totally worth replacing the entire platform and spending 1k € on it... by your logic.



The i7 920 is one of the nicest CPUs around; however, it heats up quite a lot. 
In my case, I've owned a less powerful CPU, an FX 8320, since 2012, and I still can't think of upgrading; the thing is a beast.


----------



## RejZoR (Jul 26, 2015)

It's not that problematic, to be honest. I had it in a TT Lanbox, cooled by a Thermalright low-profile cooler crammed below the PSU. Granted, I couldn't OC it to 4.2 GHz that way, but it was manageable at 3.2 GHz. With an AiO, it feels superb even at the lowest fan RPM, and it's what I use for ALL workloads. And let's not forget, it's the exact same configuration as Skylake (same cache size, same number of cores, etc.), just made on a much larger process node. It's 6 years old, after all... It's normal that it's "hot" compared to current processors.


----------



## R-T-B (Jul 26, 2015)

Caring1 said:


> They do, they're called XEON!



More like HEDT



LAN_deRf_HA said:


> All the 920 champions have quite the rose tinted view of the first gen i7s. They don't clock well, ran hot and hungry when pushed past 4ghz, and weren't any faster than yorkfield for gaming. They were stomped on as soon as Sandy showed up. I don't know of anyone who jumped on Sandy from Nehalem and regretted it. And now it's been replaced 6 times over, each time with a performance boost, and you people are still bemoaning the lack of advancement? Get over it. These mental gymnastics used to justify keeping a cpu from 2008 are absurd.



I bought a Haswell-E coming from Nehalem and regretted it.  Start counting, man.  Nehalem also clocked insanely well...  Haswell-E and its entire 20% (from Nehalem) IPC increase?  Look at my overclock.  Do the math.  Not worth it.


----------



## 1Kurgan1 (Jul 26, 2015)

I'm happy I bought my 5820k instead of waiting for these after seeing news like this.


----------



## GreiverBlade (Jul 26, 2015)

R-T-B said:


> I bought a Haswell-E coming from Nehalem and regretted it.  Start counting, man.  Nehalem also clocked insanely well...  Haswell-E and its entire 20% (from Nehalem) IPC increase?  Look at my overclock.  Do the math.  Not worth it.


I am happy I have a 4690K instead of the 920 I had... and 16 GB of 2400 instead of 12 GB of 1600 (OK, going from 3.5 to 4.5 is not the same as 2.66 to 4.0... after all, 340 MHz is a huge gap, HT is a totally useful thing, and triple channel gives a HUGE edge over dual in day-to-day use), and it cost a bit less than the initial investment. (I suspect it's the cost of 2011-v3 that makes you feel that way... with a 4790K you could have a different feeling for a fraction of the cost involved.)
Face it, the actual successor to 1366 is 1150/1151, not 2011/2011-v3 (unless you have a 990X). You have the right to be nostalgic, but being realistic is more important.

Just in case... even at 20% more IPC (unless that's a Haswell-E-only improvement), a 4790K has a base clock of 4.0; if my 920 effectively went from 2.66 to 4.0, a 4790K does that at stock, can clock higher, and needs less power... technically my 4690K is faster than my 920 at their respective OCs and also needs less power (NO HT INVOLVED, THANKS; keep it realistic in my domain... I don't edit video or use heavily threaded software, just gaming, day-to-day browsing, and the odd benchmark). End words: better and worth the change (plus, as I said in another thread, reselling my 1366 setup got me back 30% of the 1150 setup investment, but I didn't take a 2011-v3).

Well, for the iGPU... I am happy I have an iGPU in my CPU in case my discrete card craps out (if my GPU would do that... I've never needed it more than once, for re-flashing a 6950).


----------



## R-T-B (Jul 26, 2015)

GreiverBlade said:


> You have the right to be nostalgic, but being realistic is more important.



It's not nostalgia.  It's that the money-to-performance ratio just isn't there.  Another video card would've served me far better.

Mind you, I was going from a 990X i7, not quite the same as the 920...  Same damn core, though, minus AES instructions.


----------



## GreiverBlade (Jul 26, 2015)

R-T-B said:


> It's not nostalgia.  It's that the money to performance ratio just isn't there.  Another video card would've served me far better.
> 
> Mind you, I was going from a 990x i7, not quite the same as the 920...  Same damn core though minus AES instructions.


Well, then, if it was a 990X I understand.

Though I still find feeling deceived a bit delusional, since even going from the highest 1366 chip to the lowest 2011-v3 should still yield a substantial advantage (one being the PRICE... I mean, come on, people still try to sell the 980 and 990 above a 5820K's price). Well, the 990X was a bit more efficient (5%) on TDP, but at $999 versus $389 I don't think the 5820K has a worse price-to-performance ratio, roger?

Apples to apples, the 990X's successor is the 5960X for me.


----------



## Aquinus (Jul 26, 2015)

Caring1 said:


> They do, they're called XEON!


There are Xeons with iGPUs.


R-T-B said:


> More like HEDT


This. More like a real server platform that has no use for an iGPU. Most consumers will probably use an iGPU, so it makes sense that most mainstream CPUs have one.


----------



## R-T-B (Jul 26, 2015)

GreiverBlade said:


> Well, then, if it was a 990X I understand.
> 
> Though I still find feeling deceived a bit delusional, since even going from the highest 1366 chip to the lowest 2011-v3 should still yield a substantial advantage (one being the PRICE... I mean, come on, people still try to sell the 980 and 990 above a 5820K's price). Well, the 990X was a bit more efficient (5%) on TDP, but at $999 versus $389 I don't think the 5820K has a worse price-to-performance ratio, roger?
> 
> Apples to apples, the 990X's successor is the 5960X for me.



The 990X was purchased years later, well below MSRP (I got it for about $350)...  I upgraded from a 920 then.

Don't get me wrong, I don't feel deceived; I just feel Intel could be doing a lot more with this die shrink given proper competition...


----------



## GreiverBlade (Jul 26, 2015)

R-T-B said:


> The 990X was purchased years later, well below MSRP (I got it for about $350)...  I upgraded from a 920 then.
> 
> Don't get me wrong, I don't feel deceived; I just feel Intel could be doing a lot more with this die shrink given proper competition...


Though Skylake is not a die shrink... Broadwell was.

OK, I get your meaning, and I agree.


----------



## wagana (Jul 26, 2015)

I hope it doesn't cost the same as Broadwell...


----------



## Dbiggs9 (Jul 26, 2015)

I feel a small need to upgrade from a 990X.


----------



## AluminumHaste (Jul 26, 2015)

RejZoR said:


> The 6700K CPU should be a hexacore. It's 2015 and they still consider a quad-core to be "enthusiast" level. C'mon, really!? I see ZERO point in switching, and I have a Core i7 920. The only things I'd realistically gain are power consumption and some new instructions. Do the math on how long I can use my existing i7 920 to justify the price difference in electricity bills...
> 
> From what I've checked, everything is identical. Cache sizes, core count, thread count, etc. Hell, I even have triple channel on my ancient grunt, and Skylake is only dual channel. Like, ugh!? Totally pointless product. It only makes sense if you don't have a computer and you're buying from scratch. Or you have some shitty dual-core from 10 years ago...



I'll have to disagree with you. I upgraded from an i7 920 like yours, which I had overclocked to 3.8 GHz on all cores. When I got my 4770K, at stock it was faster in almost every game. Same video cards: 2x Radeon 7950 in CrossFire.
Once I overclocked the 4770K to 4.4 GHz, it became about twice as fast, so it's not a linear comparison.
And then I got a 780 Ti and haven't had to think about upgrading yet.

I can't imagine going from a 920 to a 6700K; that would be a huge increase.

Clock for clock, I had no decrease in performance going from triple channel to dual channel either, EXCEPT when I made a RAM drive. But then again, what's the difference between 8 GB/s and 3 GB/s? That sounds like a lot, but it made zero difference in load times for the games I played on that RAM drive.


----------



## RejZoR (Jul 26, 2015)

* cough * bullshit * cough *

Unless you take only the most CPU-bottlenecked games, you'll see no real difference.

Change the entire platform for 800-1000 € and gain 2 fps, or buy a new graphics card for half that and gain 30+ fps... hm...


----------



## Nordic (Jul 26, 2015)

How does going from a 920 to a 4790K affect minimum fps? Average fps would show no difference, but minimum fps should show a noticeable difference; I mean noticeable in-game. If you average 80 fps but drop to 40 fps, that is bad compared to an 85 fps average with a 60 fps minimum.


----------



## AluminumHaste (Jul 26, 2015)

RejZoR said:


> * cough * bullshit * cough *
> 
> Unless you take only most CPU bottlenecked games, you'll see no real difference.
> 
> Change entire platform for 800-1000 € and gain 2fps or buy a new graphic card for half that and gain 30+ fps... hm...



Well, it's not bullshit, but I did have to get a new motherboard, processor, RAM, etc., so other efficiencies were also added in, which probably contributed to the increase in performance. 
Same video cards, though; those beasts lasted me a long time. Loved those Windforce 7950s.


----------



## RejZoR (Jul 26, 2015)

james888 said:


> How does going from a 920 to a 4790k effect minimum fps. Average fps there would be no difference, but minimum there should be a noticeable difference. I mean noticeable in game. If you average 80 fps but drop to 40 fps that is bad, as compared to 85 fps average and 60fps minimum.



I've tested it easily: stock i7 920 vs. overclocked i7 920. And I could hardly notice any difference until a game started hammering the CPU very hard. With just a mild bump to 3.2 GHz, I frankly couldn't notice much difference compared to 4.2 GHz. I basically run it at this clock just because I can. No other reason.


----------



## AluminumHaste (Jul 26, 2015)

RejZoR said:


> I've tested it easily: stock i7 920 vs. overclocked i7 920. And I could hardly notice any difference until a game started hammering the CPU very hard. With just a mild bump to 3.2 GHz, I frankly couldn't notice much difference compared to 4.2 GHz. I basically run it at this clock just because I can. No other reason.



We're not talking about the same thing, though. I'm not talking about just overclocking the 920; I'm talking about changing to a new CPU architecture. The difference between the 920 and the 4770K at the same clock speeds was huge. (Exaggerated a little.)


----------



## RejZoR (Jul 26, 2015)

If an i7 920 at clocks beyond 3.2 GHz shows very little difference, what makes you think a 4770K would make any difference? It's still just a CPU doing the exact same workloads as the 920... It's not like these new CPUs are so incredibly radical that they make a difference by that alone. Realistically, that happened with Nehalem, when they re-introduced HT, which was pretty much useless on P4s because there was no software to really use it. HT is a very radical approach to doing computation. We haven't seen anything similar since...

There are various speculations about eDRAM on Skylakes (probably the Skylake-E variants), but no one really knows how they'll use it. It could be just for the iGPU, and wouldn't make any difference for anything else, or they might use it for general CPU tasks, and that might be interesting. But other than that, nothing really new worth mentioning. Die shrinks are like "meh" these days, really. It's nice to have one, but essentially yawn-inducing...


----------



## AluminumHaste (Jul 26, 2015)

There's a lot more to CPUs than just clock speeds, man... You know what, I do have the 920 at home; I'll try it tonight with some CPU benchmarks.


----------



## RejZoR (Jul 26, 2015)

I don't care about synthetic benchmarks. I care about real world performance and differences you can actually see and feel.


----------



## AluminumHaste (Jul 26, 2015)

*EDIT: Just realized I have an i7 930, not 920. Sorry about that.*



RejZoR said:


> I don't care about synthetic benchmarks. I care about real world performance and differences you can actually see and feel.



Hey man, I'm right with you there; I want to know if it was all in my head or real. So I did start with a synthetic benchmark; I just used the benchmark tool built into ThrottleStop. Both processors are running all cores/HT at 3.8 GHz.

i7 930:





i7 4770k:





So for that benchmark, it went from almost 12 seconds to 7 seconds. Not bad.

Here I'm trying Warframe with everything graphical turned down (including resolution) to keep the GPUs out of the equation as much as possible.

i7 930:





i7 4770k:





If you check the FPS counter at the bottom, 930 is around 548 fps, while the 4770k is around 847 fps.
Almost double the performance. Okay so it wasn't just me, there is a massive difference.

*EDIT 2: *Even though I had everything graphical turned down, the GPUs were still being used a lot. The Radeon 7950 with the i7 930 was at 45% usage, and the 780 Ti with the i7 4770K was at 32% usage, so take those results with a HUGE grain of salt, man.


----------



## Nordic (Jul 26, 2015)

I really applaud the effort to actually do some benchmarks. That is a 55% boost in performance clock for clock. That is very nice.

I don't know if Warframe is the best game for testing whether it's noticeable, though. Yes, it shows a real-world boost, but you were already at 550 fps; anything beyond that isn't even perceptible. I can't name one offhand, but it would be more ideal to test a game that was getting low fps on the 930 and 55% more fps on the 4770K.
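Nordic's 55% figure checks out against the fps numbers posted above:

```python
# Clock-for-clock FPS comparison from the Warframe screenshots,
# both CPUs at 3.8 GHz.
fps_930 = 548     # i7 930
fps_4770k = 847   # i7 4770K

gain_pct = (fps_4770k / fps_930 - 1.0) * 100.0
print(f"~{gain_pct:.0f}% more fps clock for clock")  # ~55% more fps clock for clock
```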


----------



## happita (Jul 27, 2015)

AluminumHaste said:


> If you check the FPS counter at the bottom, 930 is around 548 fps, while the 4770k is around 847 fps.
> Almost double the performance. Okay so it wasn't just me, there is a massive difference.
> 
> *EDIT 2: *Even though I had everything graphical turned down, the GPUs were still being used a lot. The Radeon 7950 with the i7 930 was at 45% usage, and the 780ti with the i7 4770k was at 32% usage, so take those results with HUGE grain of salt man.



While it's important to note that the difference is there, the gap between those two CPUs closes when the graphical settings are set to realistic levels. I won't even guess how many people who buy CPUs like the 4770K/4790K pair them with a good GPU, but using that as a base: with graphical settings scaled properly in demanding titles, the CPU does little to improve the playing experience in games that are GPU-bound (e.g. Crysis 3, Far Cry 4, Dragon Age). It's hard to justify a CPU upgrade when it will only net you ~10% gain while costing $300+.

I feel like CPUs are no longer a limiting factor in this day and age when considering a PC gaming build. It seems like even if you get a low-end/mainstream CPU and give it a small OC, it will keep you going for a long time. Case in point: my 2500K has not slowed me down one bit. I might wait for Cannonlake or Zen, depending on how AMD plays its cards this time around.


----------



## geon2k2 (Jul 27, 2015)

RejZoR said:


> * cough * bullshit * cough *
> 
> Unless you take only most CPU bottlenecked games, you'll see no real difference.
> 
> Change entire platform for 800-1000 € and gain 2fps or buy a new graphic card for half that and gain 30+ fps... hm...




You're absolutely right.
I recently upgraded from a Phenom II X4 to a Haswell i5, and although I get ~40% more performance in pure CPU synthetics, my 3DMark score has hardly moved and I don't feel any significant improvement in games. There is some improvement in scenes where the frame rate drops, which you can see if you game with FRAPS on, but nothing you would notice with FRAPS off, and where drops and stutter existed before, they are still there even after the upgrade.
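That matches simple Amdahl-style math: if only part of each frame's time is spent waiting on the CPU, even a big CPU speedup barely moves the total frame time. A sketch (the 30% CPU-bound share per frame is a made-up illustrative number, not a measurement):

```python
def frame_speedup(cpu_fraction, cpu_speedup):
    """Amdahl's law: overall speedup when only the CPU-bound
    fraction of each frame's time gets faster."""
    return 1.0 / ((1.0 - cpu_fraction) + cpu_fraction / cpu_speedup)

# Illustrative numbers: assume 30% of frame time is CPU-bound and the
# new CPU is 40% faster (1.4x) on that portion, matching the ~40%
# synthetic gain mentioned above.
s = frame_speedup(cpu_fraction=0.30, cpu_speedup=1.40)
print(f"overall frame rate: ~{(s - 1.0) * 100.0:.0f}% faster")  # ~9%
```

So a 40% faster CPU can translate into single-digit fps gains once the GPU dominates the frame.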

Even badly coded games that rely heavily on single-thread performance, like Starcraft 2 (the main reason I went i5), don't feel that much better. (Badly coded because the game uses ~70% of one core and 40% of another... the rest sit there doing nothing.)

Anyway, I upgraded mostly because my platform was very old; I could already find good-looking settings that got me over 60 fps in all my games even on the Phenom.

BTW, I also have a 7850, which I want to upgrade, but again that's because of its age and because at 1 GB it has quite little memory, not because it doesn't give me good performance; in fact it's still a monster, running everything I throw at it flawlessly.

One more thing about the memory: I don't see how an iGPU, even with 128 MB of eDRAM or whatever, can compete with a card with dedicated memory. This iGPU stuff is very good for laptops, but on the desktop it's completely useless, and any serious gamer will eventually get a discrete card.

This makes Intel's iGPU push look very stupid from my perspective: in the same die area they could make 2, maybe 3 CPUs (even with slower 2D/3D integrated graphics) and sell them a bit cheaper, with much better margins.


----------



## Ubersonic (Jul 27, 2015)

RejZoR said:


> Show me the difference in games between i7 920 at 4,2GHz and that Skylake. It'll probably be identical. Paying premium for 3 seconds less in 7zip compression, I couldn't care less...



Funnily enough, those benchmarks aren't out yet; however, given the performance increase, they will certainly not be identical. I myself upgraded from a 920 at 4 GHz to a 4930K at 4.5 GHz, and the difference was noticeable before I even overclocked the 4930K. CPU-limited games like WoW benefited and allowed me to raise settings; GPU-limited games only saw a minor increase in max FPS, but the average FPS took a noticeable bump, again allowing settings to be raised.


----------



## RejZoR (Jul 27, 2015)

If you're getting 500+ fps in a game, of course you'll see huge differences; at that point EVERYTHING is down to the CPU. But you're already at such a ridiculously high framerate that it doesn't even matter anymore whether it's 500 or 1500 fps.


----------



## Fx (Jul 27, 2015)

john_ said:


> Cheap prices were a necessity for AMD most of the time. Even when they had faster processors, Intel was controlling the OEMs. Every OEM and retail store was selling Pentium 4s. After that, the Phenom processor wasn't that fast, and the Bulldozer architecture was a pure disaster. So how can you go out and charge equally or more? The competition is controlling the OEMs, the retail stores, the press. People are used to blaming AMD for the same things they will find plenty of excuses for when it's Nvidia or Intel. When everything is against you, can you really expect to empty your warehouses with prices that are not ultra-competitive? Fury X came out at the same price as the 980 Ti, and guess what: everyone was looking at the second decimal on the fps counter to say the card was a failure. Suddenly a pump noise was compared to a jet engine, and tech sites rushed to write articles about how AMD failed there. And they rushed even though it was already known that the problem was fixed. When everyone is pointing a gun at you, can you really charge extra?



Boom. There it is.


----------



## krimetal (Jul 27, 2015)

> Moving on to CPU, and the performance-increase is a predictable 10-20% single/multi-thread CPU performance, over "Broadwell." This is roughly similar to how "Haswell" bettered "Ivy Bridge," and how "Sandy Bridge" bettered "Lynnfield."



I don't recall Haswell beating Ivy by 10-20%, nor Ivy beating Sandy that way. Sandy was indeed a big leap from Lynnfield, but after Sandy the progress was much smaller; the gains were more evident on the iGPU side.
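Even modest per-generation gains do compound, though. A sketch with purely illustrative single-digit steps (not measured IPC numbers):

```python
# Compound effect of small per-generation CPU gains.
# The percentages are hypothetical illustrations, not benchmarks.
gains = [0.05, 0.08, 0.10]  # e.g. three successive generational steps

total = 1.0
for g in gains:
    total *= 1.0 + g

print(f"cumulative gain: ~{(total - 1.0) * 100.0:.0f}%")  # ~25%
```

Which is why skipping several generations (e.g. Lynnfield straight to Skylake) feels like a much bigger jump than any single one.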


----------



## RejZoR (Jul 29, 2015)

I think I'll go with the upgrade anyway. I need to refresh my system, since it's been acting a bit funny lately, and buying a new LGA1366 board just doesn't seem viable at this point (and would be hard to actually do, since they're rather hard to come by these days).

A Core i7 6700K paired with 32 GB of RAM, a new AX 760i PSU, and a new case, probably a Corsair Carbide Silent. I'll be keeping the graphics card, sound card, and HDD/SSD. It should be fine for quite some time even without any overclocking. If it lasts six years like this one, I'll be happy. Let's wait for August...


----------



## VicXander (Aug 17, 2015)

RejZoR said:


> Show me the difference in games between i7 920 at 4,2GHz and that Skylake. It'll probably be identical. Paying premium for 3 seconds less in 7zip compression, I couldn't care less...



Maybe you should watch this *Intel Skylake Core i7 6700K vs 4790K/3770K/2600K Gaming Benchmarks* comparison. I know that video doesn't directly compare the Skylake i7 to your i7 920, but seeing it beat the i7 3770K, which is already much faster clock for clock than your 1st-gen i7, you can notice the difference.

And click this for a synthetic benchmark from AnandTech.
Don't get me wrong, I just want to show you how each generation of i7 has improved its performance.


----------

