# 95W TDP of "Skylake" Chips Explained by Intel's Big Graphics Push



## btarunr (May 12, 2015)

Intel's Core "Skylake" processor lineup, built on the company's swanky new 14 nanometer fab process, turned heads with its rather high 95W TDP for quad-core parts such as the Core i7-6700K and Core i5-6600K, even though their 22 nm predecessors, such as the i7-4770K and i5-4670K, run cooler at 84W TDP. A new leaked slide explains the higher TDP. Apparently, Intel is going all-out with its integrated graphics implementation on Core "Skylake" chips, including onboard graphics that leverage eDRAM caches. The company is promising as much as 50% higher integrated graphics performance over "Haswell."

Although the chips have a high rated TDP, overall energy efficiency tells a different story. SoCs based on "Skylake" will draw as much as 60% less power than "Haswell"-based ones, translating into 35% longer HD video playback on portable devices running these chips. Intel's graphics performance push is driven by an almost sudden surge in display resolutions, with standards such as 4K (3840 x 2160) entering the mainstream, and 5K (5120 x 2880) entering the enthusiast segment. Intel's design goal is to supply the market with a graphics solution that makes the two resolutions functional for desktop use and video, if not gaming.







----------



## btarunr (May 12, 2015)

This also means that on systems with discrete graphics (power to the IGP gated), the TDP of these chips will be lower than Haswell's.


----------



## NC37 (May 12, 2015)

Had AMD not made APUs and forced Intel to get off their butts, 4K would have arrived and Intel would be pissing their pants, scrambling to make graphics that don't suck.

So literally, thank you AMD for making this day happen. Now you'd just better deliver with Zen, or you'll have no way to flaunt your APUs in front of Intel anymore.


----------



## dj-electric (May 12, 2015)

I won't give too much credit to AMD for this, as Intel has integrated GPUs into their systems for a long time, first on motherboards, and on CPUs ever since 2009. They are not late to the game in any way.

Granted, the power of those GPUs wasn't that good up until the HD 2500, and with what I presume will be R7 250-like performance on top Skylake CPUs, the simple assumption that "all APUs have better graphics horsepower than any Intel CPU" is going to change. A lot.


----------



## NC37 (May 12, 2015)

Yeah, Intel did it for a while, but their GPUs sucked until AMD started putting Radeons with their crappy CPUs and calling them APUs. Intel has been the laughing stock of graphics for a long time. Heck, their answer to discrete GPUs was to link a bunch of x86 CPUs together and call it a GPU. I know some people are upset that product was canceled, because they all talked about how it would revolutionize how games were made. Me, I'm glad it was canned, because Intel had no idea how to make GPUs. They showed off the tech for years, and when they finally started getting serious they realized any performance advantage it had was long gone.

Competition is good, and when you've got AMD gloating about having discrete-level graphics on a CPU, it's like a little monkey poking a gorilla with a tiny poop stick. It might not notice right away, but eventually it's gonna turn around and swat it.

Before APUs, the last time I remember Intel being proud of a new GPU was back with the GMA 950, and that was a laughing stock when it launched. Since APUs, they've actually been putting forth effort. I was honestly impressed that the HD 4000 in my laptop could actually run some things. I figured it would be utter crap compared to the 660M in it, and well, it is, but it was better than expected.

They can't rival AMD or nVidia on the high end, but it's good to see them finally putting effort into graphics. Just imagine if Intel got really serious about graphics and decided to enter the GPU war. They've got the R&D, the fabs, and all the brains to be able to do it.


----------



## lZKoce (May 12, 2015)

NC37 said:


> Just imagine if Intel got real serious about graphics and decided to enter the GPU war. They got the R&D, fabs, and all the brains to be able to do it.



Spot on. I think, capacity-wise, they have the biggest infrastructure of any GPU manufacturer right now. I still like the looks of Larrabee :)


----------



## Wark0 (May 12, 2015)

This doesn't explain the 95W TDP; that TDP will apply only to the K-series parts, and the K parts don't have GT4e.


----------



## Joss (May 12, 2015)

How many people buy a high-end i5/i7 and use the integrated graphics?
A family of APUs and another of CPUs would make more sense.


----------



## Debat0r (May 12, 2015)

I really don't think Intel is doing this graphics thing all that well. Sure, they've got good graphics with the new Iris series, but they only put them in high-end chips, which is the market segment that almost exclusively uses dedicated GPUs. Are they really that stupid? To challenge AMD's APU might, they need to push out a Pentium or i3 with this kind of GPU, not an i7...


----------



## tehehe (May 12, 2015)

The biggest problem as far as Intel GPUs go is terrible driver support. Maybe DX12/Vulkan will change that, assuming most games will be written with these APIs.


----------



## lZKoce (May 12, 2015)

Debat0r said:


> I really think Intel isn't really doing this graphics thing all that well. Sure, they got good graphics with the new iris series, but they only put them in high-end chips, which is the market that almost exclusively uses dedicated GPUs. Are they really that stupid?



I don't think they are, because you're still gonna pay for the whole chip, and they collect their toll for the iGPU regardless of whether you use it or not. So if you want a high-end CPU from Intel, you pay for the iGPU as well, it seems.


----------



## RejZoR (May 12, 2015)

Intel had GPU's for ages, but they were absolute garbage until AMD forced them to do something. They are still rather rubbish, but at least they improved them significantly.


----------



## techy1 (May 12, 2015)

I wish Intel would stop doing iGPUs... cuz I do not need them... I will always choose a mid-to-top range discrete GPU (and even low-end GPUs are far more capable than an iGPU), so that iGPU is a waste of my money, time, and TDP. Intel is like: "oh, look, a brand new iGPU, +50% over the previous gen, almost 4K capable - we did this all for you, buddy. No matter that you do not and will not use the iGPU, our team worked two years (and in that time made only a 0-5% CPU gain) to give you this experience you will never use or need - lucky you, now pay extra for our useful work." I would choose, for that extra money and extra TDP, +1% CPU power over +50% iGPU... but there is no such option (never was and never will be, and that sux).


----------



## Aquinus (May 12, 2015)

lZKoce said:


> I don't think they are, because you're still gonna pay for the whole chip, and they collect their toll for the iGPU regardless of whether you use it or not. So if you want a high-end *mainstream* CPU from Intel, you pay for the iGPU as well, it seems.



I updated your comment. As I've said in the past, skt1150 is Intel's mainstream platform, not a HEDT platform like 2011-3. As a result, like APUs, there is an expectation of graphics on the chip. So while Intel might have "fast" or "high end" CPUs for the socket, it doesn't change the fact that it's still a mainstream consumer platform, much as AMD's APU lineup is.


NC37 said:


> Heck their answer to discrete GPUs was to link a bunch of x86 CPUs together and call it a GPU.


That's because only nVidia and AMD have the rights to shader technology. Intel was forced to use x86-like cores to do GPU work because that's what they had access to.


Debat0r said:


> I really think Intel isn't really doing this graphics thing all that well. Sure, they got good graphics with the new iris series, but they only put them in high-end chips, which is the market that almost exclusively uses dedicated GPUs. Are they really that stupid? To challenge AMD's APU might they need to push out a Pentium or i3 with this kind of GPU, not an i7...


I do? My laptop has an Iris Pro in it, and it will do everything from work to 4K video. I have to say Iris Pro is a lot better than all of the other iGPUs Intel has conjured up.


lZKoce said:


> Spot on. I think, capacity speaking they have the biggest infrastructure compared to any other GPU manifacturer right now. I still like the looks of Larrabee )


Intel can't compete in that market unless they build an entirely new GPU architecture from scratch. Once again, Intel doesn't own the rights to make shaders, whereas AMD and nVidia do. That alone complicates matters.


techy1 said:


> I wish Intel would stop doing iGPUs... cuz I do not need them... I will always choose a mid-to-top range discrete GPU (and even low-end GPUs are far more capable than an iGPU), so that iGPU is a waste of my money, time, and TDP. Intel is like: "oh, look, a brand new iGPU, +50% over the previous gen, almost 4K capable - we did this all for you, buddy. No matter that you do not and will not use the iGPU, our team worked two years (and in that time made only a 0-5% CPU gain) to give you this experience you will never use or need - lucky you, now pay extra for our useful work." I would choose, for that extra money and extra TDP, +1% CPU power over +50% iGPU... but there is no such option (never was and never will be, and that sux).


Most people don't use computers like people at TPU do. Intel has a market to satisfy, and most of it wants iGPUs. Intel doesn't really give two shits what we want out of a mainstream platform. If you told that to Intel, they would probably laugh at you and tell you to go skt2011-3 if you're so bent out of shape about it. Simple fact: most consumers don't need a discrete graphics card, at least not right off the bat, if they ever will. Mainstream platform means mainstream features; if you don't like it, then don't buy it. Also, Intel's IPC has increased more than 1%, and squeezing performance out of a core that they've already squeezed a lot out of is a tedious task. Maybe you should go help Intel design a new one...

Also, Iris Pro can drive 4K video on a 4K display. I know because I've done it and it works great. Is it good for gaming? Not really. But is it good for everything except gaming, or maybe even a bit of light gaming? Sure. Just remember, Intel makes most of its money off businesses, not individual consumers, so it only makes sense that their products reflect the market, and a huge chunk of the market uses iGPUs or has no use for discrete graphics.


----------



## Yellow&Nerdy? (May 12, 2015)

I don't get why Intel even includes an iGPU on the unlocked models. I'm pretty sure no one buys an unlocked processor without also getting a dedicated GPU.


----------



## RejZoR (May 12, 2015)

Because it's cheaper for them to churn out all the same CPUs than to engineer two different designs, one with an iGPU and one without.


----------



## techy1 (May 12, 2015)

Aquinus said:


> Most people don't use computers like people at TPU do. Intel has a market to satisfy, and most of it wants iGPUs. Intel doesn't really give two shits what we want out of a mainstream platform. If you told that to Intel, they would probably laugh at you and tell you to go skt2011-3 if you're so bent out of shape about it. Simple fact: most consumers don't need a discrete graphics card, at least not right off the bat, if they ever will. Mainstream platform means mainstream features; if you don't like it, then don't buy it. Also, Intel's IPC has increased more than 1%, and squeezing performance out of a core that they've already squeezed a lot out of is a tedious task. Maybe you should go help Intel design a new one...
> 
> Also, Iris Pro can drive 4K video on a 4K display. I know because I've done it and it works great. Is it good for gaming? Not really. But is it good for everything except gaming, or maybe even a bit of light gaming? Sure. Just remember, Intel makes most of its money off businesses, not individual consumers, so it only makes sense that their products reflect the market, and a huge chunk of the market uses iGPUs or has no use for discrete graphics.



Sure thing: i3 or Pentium buyers do not even know what the letters GPU mean, and slogans like "+50% mooaar... 4K" are good enough to make them buy... but what about the i7 and K-series? Nobody in the mainstream needs or understands those benefits, so why do those series need "+50% moaar 4K" slogans? The clients who need or understand an "i7" should only be pissed off reading something like that. (I believe 4K video is fine for that new iGPU... I was positively surprised back in the day that my netbook with a Sandy Bridge i3 could run 1080p... but then again, the keywords there are "netbook", "i3", "no discrete", and not "i7-K", "Z170", "above-$200 GPU".)


----------



## Lionheart (May 12, 2015)

I'm all for better Intel GPUs, but like some of you guys mentioned, it kinda feels like a waste when most PC enthusiasts will be using a dedicated GPU. But then again, I can see why too: Intel wanna compete & be above AMD on all levels...


----------



## Aquinus (May 12, 2015)

Lionheart said:


> kinda feels like a waste when most PC enthusiasts will be using a dedicated GPU


There is this thing called skt2011-3. Don't invest in a mainstream platform if you don't like the features; it's really that simple. Intel just does the "one size fits all" thing for 1150, and for the vast majority of people, that's fine.


techy1 said:


> Sure thing: i3 or Pentium buyers do not even know what the letters GPU mean, and slogans like "+50% mooaar... 4K" are good enough to make them buy... but what about the i7 and K-series? Nobody in the mainstream needs or understands those benefits, so why do those series need "+50% moaar 4K" slogans? The clients who need or understand an "i7" should only be pissed off reading something like that. (I believe 4K video is fine for that new iGPU... I was positively surprised back in the day that my netbook with a Sandy Bridge i3 could run 1080p... but then again, the keywords there are "netbook", "i3", "no discrete", and not "i7-K", "Z170", "above-$200 GPU".)


Then by your logic, no one on a mainstream platform cares about overclocking either. The problem is that that statement is false. An i7 is a performance-model CPU for a *mainstream platform*. No one ever said anything about the i7 being a mainstream CPU, but it doesn't change the fact that this socket is intended to reach some of the lowest price points (by Intel's standards) as well as the highest. It always needs to cater to the lowest, because that's where the bulk of the sales are, not with enthusiasts wanting stuff just a certain way. If you're really that bent out of shape, then don't buy the damn socket; just go with Intel's HEDT platform instead.

Another way of putting it: enthusiasts are a very small niche, and most "enthusiasts" don't even want to spend much money anyway, so you'd be stuck with a mainstream platform by virtue of your budget. So no, this is Intel making a one-size-fits-all. If you're not happy with it, you have options; it's called Haswell-E. I felt a similar way when I upgraded 3 years ago, hence why I have a 3820 and not a 2600K.


----------



## Prima.Vera (May 12, 2015)

Why the fk I need a strong GPU on an i7 K processor is beyond my comprehension! Usually people buying i7s are buying for gaming and multimedia. I only need a very basic GPU and that's it. A HUGE WASTE of transistors, and therefore a big-arse TDP. Seriously, sometimes I think those managers at Intel are worse than monkeys.


----------



## Aquinus (May 12, 2015)

Prima.Vera said:


> Why the fk I need a strong GPU on an i7 K processor is beyond my comprehension! Usually people buying i7s are buying for gaming and multimedia. I only need a very basic GPU and that's it. A HUGE WASTE of transistors, and therefore a big-arse TDP. Seriously, sometimes I think those managers at Intel are worse than monkeys.


...because it's a *mainstream* platform. skt1156/1155/1150 are all mainstream platforms with a full lineup of CPUs from entry to performance. Cool your jets and calm down. Maybe you should go work for them and design a new CPU if your shit don't stink. If you're really that bent out of shape about the platform, then get a HEDT platform and stop complaining.


----------



## TheinsanegamerN (May 12, 2015)

techy1 said:


> I wish Intel would stop doing iGPUs... cuz I do not need them... I will always choose a mid-to-top range discrete GPU (and even low-end GPUs are far more capable than an iGPU), so that iGPU is a waste of my money, time, and TDP. Intel is like: "oh, look, a brand new iGPU, +50% over the previous gen, almost 4K capable - we did this all for you, buddy. No matter that you do not and will not use the iGPU, our team worked two years (and in that time made only a 0-5% CPU gain) to give you this experience you will never use or need - lucky you, now pay extra for our useful work." I would choose, for that extra money and extra TDP, +1% CPU power over +50% iGPU... but there is no such option (never was and never will be, and that sux).


Go buy a six-core i7 and stop complaining. Intel already made what you want. Yeah, it costs way more, but guess what? A tiny market demands high prices. Sockets 1366 and 2011 exist for people like you.


----------



## Yellow&Nerdy? (May 12, 2015)

RejZoR said:


> Because it's cheaper for them to churn out all the same CPUs than to engineer two different designs, one with an iGPU and one without.


No doubt that is the case and the reason why Intel is doing it this way, but I wonder if they could possibly disable the iGPU on the unlocked models.


----------



## ensabrenoir (May 12, 2015)

.....silly enthusiasts (me included)..... business class and mainstream pay the bills. In the all-in-one desktop, laptop, and *N*ew *M*obile *W*orld *O*rder, an i7 with a strong iGPU is killer, because it's all about efficiency, battery life, small form factor, and price. Adding discrete GPUs adds cost, heat, size, and power draw: all minuses to the common professional, female or male. Successful Business 101: know your market/customers (revenue source) and supply a product that they need/like and will use, at a price they're willing to pay. And say what you will about Intel's prices... their continuing success still speaks louder.


----------



## Lionheart (May 12, 2015)

Aquinus said:


> There is this thing called skt2011-3. Don't invest in a mainstream platform if you don't like the features, it's really that simple. Intel just does the "one size fits all" thing for 1150 and for the vast majority of people, that's fine.



Lol, there's also this thing called money. Why would I invest in a more expensive platform that doesn't benefit me in gaming, other than more cores that don't get utilized (in before the DX12 BS) and better PCI-E bandwidth for SLI & CrossFire???? Yeah, makes total sense..

When did I say I don't like the features of Intel's integrated GPU? I said *"kinda feels like a waste when most PC enthusiasts will be using a dedicated GPU but then again I can see why too, Intel wanna compete & be above AMD on all levels"*. See! I said I understand why they're doing it.


----------



## techy1 (May 12, 2015)

Anyhow, decent discussion. But I highly doubt that even a Pentium/i3 customer buys a PC without a crappy discrete card, because he has zero choice there... if he walks into a PC shop (being as uninformed as he is), the sales guy will sell him something like a "GT 610 with 4GB DDR3" (or something along those lines). Why? Cuz he can (the client will not understand anyway), and why not take an extra $50? The same goes for online stores: a bunch of those low-cost entry-level PCs are built for completely unaware people, and those bundles are not put together to be the most cost-efficient; there is almost always some crappy, few-generations-old discrete GPU packed in. So I do not believe that "if the mainstream asks for iGPUs, then the mainstream gets them". But what pisses me off most is that when new CPUs come out, the new iGPU gets the most attention and the CPU improvements get less and less attention, and guess what, less and less improvement. Cuz there is no need: everything the tech people are talking about is 1) how huge the improvement in the iGPU department will be this generation, and 2) how much the mainstream needs it and how good it is that Intel got their iGPUs to this high level.


----------



## GhostRyder (May 12, 2015)

Well, I am glad they are working on it, though a lot of their improvements are going to come from the embedded RAM improvements on Iris. I do have to agree with some of the people on the choice of chips for these improved graphics. Personally, I would like the mobile processors and some of the lower models to at least have an enhanced iGPU option, even if it slightly added to the price (so long as they had others without it).


----------



## NightOfChrist (May 12, 2015)

Personally, I believe the iGPU would be useful to me, should I be in the situation where the discrete graphics card malfunctions and I do not have a spare card to use as a replacement, or if there is a problem with the discrete graphics card and its driver that prevents booting into Windows. Intel HD Graphics would make things easier, especially if it has excellent performance. To me, it could not hurt to have the Intel iGPU ready, and it is great that Intel has made improvements to its performance. I am looking forward to seeing what Skylake processors are capable of when they are released.


----------



## axxo22 (May 12, 2015)

Aquinus said:


> Cool your jets and calm down. Maybe you should go work for them and design a new CPU if you shit don't stink. If you're really that bent out of shape about the platform then get a HEDT platform and stop complaining.



Good lord. Mods, what does it take to get banned around here, and how has this flaming petulant jackass not managed to meet that criteria a thousand times over yet?


----------



## Casecutter (May 12, 2015)

NC37 said:


> I know some people are upset that product was canceled because they all talked about how it would revolutionize how games were made.


Intel was smart to realize they wouldn't change 3D gaming engines. They probably figured it out after seeing that AMD couldn't make software developers use multiple cores.


----------



## ozorian (May 12, 2015)

Aquinus said:


> If you're really that bent out of shape about the platform then get a HEDT platform and stop complaining.



LOL, do they pay you (Intel)?!
Relax, man, he can have his own opinion!
For god's sake, and btw, I agree with him.
I don't give a shit if the i5/i7 K-series are for mainstream platforms; I know that the majority of the buyers are gamers and performance users WHO DON'T NEED AN INTEGRATED GPU!!
About Haswell-E, which is without an integrated GPU: I would choose it if it gave me more than 3% in gaming performance (4790K vs 5820K). When I don't get that 3%, and in order to get it I must double my budget for a 5960X, sorry, but "NO THANKS".

PS: sorry for my english :S


----------



## Ravendagrey (May 12, 2015)

ozorian said:


> LOL, do they pay you (Intel)?!
> Relax, man, he can have his own opinion!
> For god's sake, and btw, I agree with him.
> I don't give a shit if the i5/i7 K-series are for mainstream platforms; I know that the majority of the buyers are gamers and performance users WHO DON'T NEED AN INTEGRATED GPU!!
> ...



Your argument means nothing... you can spend a measly 50 bucks extra and get the 5820K over a 4790K: two more cores (four more threads), more cache, DDR4 memory, plus all the other goodies on the X99 platform, considerably more performance in multithreaded tasks (including games that take advantage of cores, and with technologies like DX12, Mantle, and Vulkan that offload tasks to the CPU), and a far more future-proof system, WITHOUT an iGPU. If you try to argue that X99 motherboards are considerably more expensive, you really should compare specs between a low-end X99 board and a Z97; you'll discover that at the same price point you get pretty much the same features...


----------



## Aquinus (May 12, 2015)

Lionheart said:


> Why would I invest in a more expensive platform that doesn't benefit me in gaming other than more cores that don't get utilized


Because you're complaining about Intel's mainstream platform having mainstream features. Enthusiasts pay for what they want, quit your complaining and move on.


Lionheart said:


> kinda feels like a waste when most PC enthusiasts will be using a dedicated GPU but then again I can see why too, Intel wanna compete & be above AMD on all levels


Except it's more wasteful for Intel to do a redesign without an iGPU when most consumers do in fact have and use iGPUs. You're not everyone, and Intel doesn't care about you unless you pay for it. Get over it; nice stuff doesn't come cheap.


ozorian said:


> I don't give a shit if the i5/i7 K-series are for mainstream platforms; I know that the majority of the buyers are gamers and performance users WHO DON'T NEED AN INTEGRATED GPU!!


Welcome to the consumer market. Gamers are a niche, and Intel doesn't care about you, because they make their money off everyone else and off businesses. As far as profit is concerned, they couldn't care less what you think. The simple point is that Intel is going to put an iGPU in every CPU, because when it comes to making the CPU itself, it's cheaper: it doesn't require a redesign. Even the CPUs "without iGPUs" on the same socket still have iGPU circuitry in them, laser-cut for one reason or another. This comes down to business for Intel. So stop acting like gamers are super important; Intel honestly doesn't care, because that isn't where they make money. If you consider the market they target and the features they provide, it makes sense. They're not going to custom-make a set of CPUs on a gimped platform just to cater to a relatively small portion of the market, and when they do, it ends up in the HEDT lineup and you pay for it. So stop complaining about not getting exactly what you want on a mainstream platform.

You bet I'm an asshole, anyone here at TPU knows that I'm not afraid to speak my opinion.


axxo22 said:


> Good lord. Mods, what does it take to get banned around here, and how has this flaming petulant jackass not managed to meat that criteria a thousand times over yet?


You really registered just to say that? How about not trying to throw the thread off topic. 
Everyone already knows I'm an ass, you don't need to point out the obvious.

In summary: welcome to the mainstream market. It demands iGPUs, so Intel includes one on most of their CPUs and has built the PCH around having that iGPU. I personally think it's dumber to leave out an iGPU when you have all this dedicated crap in your motherboard for it. It's even dumber to use the power argument, because Intel has been power-gating iGPUs since Sandy Bridge, so it's not like it even adds to the heat when you have a discrete card. I personally find the argument for leaving the iGPU out amusing, since most people who have one, use it. Just because you're a gamer doesn't mean the entire market is full of PC gamers (even if it should be.)

Lastly, I would prefer an iGPU powerful enough to do everything I want over a discrete GPU that's overkill. It depends on what you're using it for, and most people use Facebook, look at email, watch video, and do everything that isn't 3D acceleration. So if Intel is thinking about how to make more money, I can bet you they're not thinking about catering to gamers.

Side note: I'm on a laptop with an Iris Pro in it now and it works fine for everything that isn't gaming.


----------



## RejZoR (May 12, 2015)

Yellow&Nerdy? said:


> No doubt that is the case and the reason why Intel is doing it this way, but I wonder if they could possibly disable the iGPU on the unlocked models.



They probably design in logic where the iGPU part gets entirely shut down, so it doesn't consume power or generate any heat. That would be done through the BIOS, or through the chipset on the motherboard, or something.


----------



## theonedub (May 12, 2015)

When I was actively Folding, I would use the iGPU on my K series so the PC could actually still be usable while the dedicated card was loaded @ 100%.

It's also nice when you want to run a super-slim mITX build and need CPU power, but have no desire (or room) for a dedicated GPU. That way you can still have accelerated video playback, fast encoding, etc. in that mini case.


----------



## Lionheart (May 12, 2015)

Aquinus said:


> Because you're complaining about Intel's mainstream platform having mainstream features. Enthusiasts pay for what they want, quit your complaining and move on.
> 
> Except it's more wasteful for Intel to redesign to not have an iGPU when most consumers do in fact have and use iGPUs. You're not everyone and Intel doesn't care about you unless you pay for it. Get over it, nice stuff doesn't come cheap.
> 
> ...




I say one little thing about an Intel iGPU & you flip your shit. The only one complaining here is you! Bitching about other ppl's opinions cause you don't like them, so how about you quit your "complaining" & move on! Complaining about other ppl complaining?? Lol

"*kinda feels like a waste when most PC enthusiasts will be using a dedicated GPU but then again I can see why too, Intel wanna compete & be above AMD on all levels*" For the 3rd time already: did you not read my comment? I said I understand why. Get it through your dense head.

Fixed one of your sentences. Call me immature, but I don't care; this site needs more humour & less ego.


----------



## Aquinus (May 13, 2015)

Lionheart said:


> I say one little thing about an Intel iGPU & you flip your shit. The only one complaining here is you! Bitching about other ppl's opinions cause you don't like them, so how about you quit your "complaining" & move on! Complaining about other ppl complaining?? Lol
> 
> "*kinda feels like a waste when most PC enthusiasts will be using a dedicated GPU but then again I can see why too, Intel wanna compete & be above AMD on all levels*" For the 3rd time already: did you not read my comment? I said I understand why. Get it through your dense head.
> 
> Fixed one of your sentences. Call me immature, but I don't care; this site needs more humour & less ego.


I think you're missing my point. It's wasted on people who actually use a discrete card, and even then, it's still functionality you wouldn't otherwise have. The simple point, which I've made time and time again, is that that does *not describe most consumers*. It's also not really a waste when it's something you wouldn't have had otherwise, and you even said yourself: why invest more money in a platform you don't need? My point is that complaining about features common on a mainstream platform because someone claims to be an enthusiast is asinine. I quote you because you're perpetuating that very problem, just as I've quoted others who are doing the same thing. The iGPU is not a disadvantage, and people make it sound like it is when for the bulk of users it's doing them a world of good. That's my problem, but once again, I've been making the same point over and over. Intel isn't doing it because AMD does; they do it because the market demands it.

Also, making comments like this only serves to perpetuate an argument outside the realm of the topic of the thread, and I will have no part of that.


Lionheart said:


> call me immature but I don't care, this site needs more humour & less ego.


----------



## alucasa (May 13, 2015)

The iGPU is pretty handy in emergency situations where your GPU simply decides to call it quits. You will have something to work with while a replacement is on its way.

And for those who don't play a lot of games but need a good CPU, it saves some bucks not having to buy a dedicated GPU as well.


----------



## tomkaten (May 13, 2015)

RejZoR said:


> Because it's cheaper for them to just churn out all the same CPUs than to create two different designs, one with an iGPU and one without.



Not really, it seems: SB, IB and Haswell all had Xeons (4 cores, Hyper-Threading) that cost less than their identical i7 counterparts with integrated GPUs.

Make no mistake, you are paying Intel about $50 for something you will probably never use. It's a borderline unfair practice IMO, because you're not given a choice.

OTOH, the cost of R&D for the great improvements in this field from one generation to the next is probably substantial, so Intel needs to absorb it by selling integrated GPUs to the masses. That's the best justification I can come up with.


----------



## xenocide (May 13, 2015)

tomkaten said:


> Not really, it seems: SB, IB and Haswell all had Xeons (4 cores, Hyper-Threading) that cost less than their identical i7 counterparts with integrated GPUs.



They also had varying feature sets and locked clocks.  You pay for features.  You're also kind of agreeing with his point: they made one design and perfected it--the Xeon variants of SB/IB/Haswell are basically i7s with a few features moved here and there.  A cost-saving measure.


----------



## Aquinus (May 13, 2015)

tomkaten said:


> Not really, it seems: SB, IB and Haswell all had Xeons (4 cores, Hyper-Threading) that cost less than their identical i7 counterparts with integrated GPUs.
> 
> Make no mistake, you are paying Intel about $50 for something you will probably never use. It's a borderline unfair practice IMO, because you're not given a choice.


Well, if you bought a Xeon but don't use VT-d, vPro, ECC memory, or any of the features that Xeons have to offer, then shame on you for buying something you don't need. You're not paying more for the iGPU, you're paying more for the features of a Xeon. Also, many Xeons do have integrated graphics; there are more with than without, IIRC.


----------



## tomkaten (May 13, 2015)

Aquinus said:


> Well, if you bought a Xeon but don't use VT-d, vPro, ECC memory, or any of the features that Xeons have to offer, then shame on you for buying something you don't need. You're not paying more for the iGPU, you're paying more for the features of a Xeon.



You misread what I said: Xeons are actually CHEAPER, even though they support ECC and TSX-NI over their i7 equivalents.

Case in point:

i7 4770:

http://www.newegg.com/Product/Product.aspx?Item=N82E16819116900&cm_re=i7_4770-_-19-116-900-_-Product

Xeon E3:

http://www.newegg.com/Product/Product.aspx?Item=N82E16819117316&cm_re=xeon_e3-_-19-117-316-_-Product

It's basically the same CPU, but the Xeon is $50 cheaper.


----------



## Aquinus (May 13, 2015)

tomkaten said:


> You misread what I said, Xeons are actually CHEAPER, although they support ECC and TSX-NI over their i7 equivalents.


Oh, sure, when you rule out overclocking. But keep in mind the Xeon you linked is newer than the 4770, and it's closer in performance to a 4790, not a 4770 (a quick Google search tells me benchmarks are slightly higher on the Xeon despite a boost clock that's 100 MHz lower than the 4770's). Compare against the Devil's Canyon CPU instead and the equivalent Xeon ends up costing only 10 dollars less than the regular 4790. So while you're right that a Xeon can cost less, it depends on what you're comparing... and you still throw overclocking out the window when you do that; a Xeon isn't really an option if you want to overclock. Then go to the far end of the spectrum and compare the highest-clocked i7 against the highest-clocked Xeon, and you'll be paying more for the Xeon. So it depends on what you're trying to do.

The E3 Xeon 1245 v3 is probably a better comparison because the clocks are the same and both have iGPUs. Only the *cheapest* E3 Xeons lack iGPUs IIRC, not the most performant ones.

All in all, it's still a zero sum game unless there are particular needs for your system.

None of this changes the fact, though, that no one would really notice a difference whether the iGPU was there or not, and if you're already spending >500 USD, 10 dollars won't make a huge difference.


----------



## tomkaten (May 13, 2015)

We're beating around the bush here. My links prove that the Xeon with the same architecture and the exact same clock speeds (OK, minus 100 MHz in single-core) is 50 bucks cheaper. How did you get to 10 bucks? You lost me in your argument progression.

Now, $50 is pocket change for some and a lot of money for others, depending on where you hail from, but one thing is certain: it's better spent on a superior discrete card. Or more RAM... or a higher-capacity SSD, you name it. Especially if you're never gonna use that iGPU. It's better when I decide what my money goes into, instead of Intel.

Direct comparison between the two:

http://ark.intel.com/compare/80910,75122


----------



## GreiverBlade (May 13, 2015)

Well, not to be bitchy or annoying:

They upped the TDP to 95 W for that iGPU, yet it's still not at the level of the A10-7850K's (100 W) iGPU, and the Kaveri iGPU doesn't even have eDRAM. It is still a bit closer than the HD 4600 was, of course.

Bear with the French language in the pics... numbers are universal.



OK, on the CPU side it's totally not the same story...

Conclusion: if I want a CPU with an iGPU and no need for a discrete card, but keep the Hybrid CFX option in mind... I'd go AMD and Kaveri instead of Skylake, even in 2015/16 (or Godavari, or the next APU, since Kaveri is due for a refresh soon).


----------



## 623 (May 13, 2015)

Intel Core i7-6700K(ES) CPU-Z


----------



## RejZoR (May 13, 2015)

tomkaten said:


> Not really, it seems, SB, IB and Haswell all had Xeons(4 cores, hyperthreading) that cost less than their identical I7 counterparts with integrated GPUs.
> 
> Make no mistake, you are paying Intel about 50$ for something you will probably never use. It's borderline unfair practice IMO, because you're not given a choice.
> 
> OTOH, the cost of R&D for great improvements in this field from one generation to another is probably big, so Intel needs to absorb that by selling integrated GPU's to the masses. That's the best personal justification I can come up with.



I've said it's cheaper for THEM. I never said it's cheaper for customers.


----------



## lilhasselhoffer (May 13, 2015)

Can somebody pop the popcorn?  This is getting to be interesting.


To the points raised thus far:
1) The inclusion of an iGPU increases the price of a processor.
Technically yes; realistically, no.  The inclusion of an iGPU is a design choice.  They start out with it, and it's due to their target audience.  Their main audience isn't gaming enthusiasts; it's business applications, where multiple tabbed spreadsheets and Flash presentations are the most graphically demanding things required.  For such usage an iGPU is a minor increase in expense that pays off hugely by cutting out dedicated video cards.  We might not like it as gamers, but we're such a small market segment that it's a moot point.

2) AMD competing with Intel, via the APU, is what spurred Intel's development of an iGPU
Nope.  Somehow people forget that hardware development takes years.  If Intel were responding to the APU, it would just now be breaking the 1080p video barrier.  This is a move from Intel that was precipitated by ARM: Intel's recent forays into tablet devices, along with the fact that they cite extra video playback time, are a dead giveaway.  Intel has already relegated AMD to the scrap bin, in no small part because AMD said they were pulling out of the CPU market.  The APU is good, but only because they strapped a functional GPU to a CPU.

3) Intel graphics are crap (paraphrased)
Yeah, I can't argue that.  The salient point here is patent law.  Nvidia and AMD own a ton of key patents for modern graphics solutions.  As neither is looking to license those patents to Intel, Intel has had to reinvent the wheel.  In the span of less than a decade Intel has gone from buggy trash to competent for business use.  That's a huge leap, considering AMD and Nvidia took much longer to get there.  If you're in doubt of this argument, compare Duke Nukem 3D to Hitman: Blood Money: that's one decade of difference, and you've still got some janky, boxy figures.  In comparison, Intel's iGPU went from the Sandy Bridge part (2011) to competent 1080p playback by 2014.

4) You're a shill for Intel
I wish.  If I were paid for this crap, I'd be able to enjoy it a lot more.  As it stands, I'm hoping that Intel sinks too much into iGPU development, Zen is as good as suggested, and Intel gets caught with its pants down.  That would precipitate another genuinely great CPU generation, akin to the Sandy Bridge era.  Skylake is unlikely to do this and, from the sounds of it, will just be another 10-15% performance increase.  Hopefully this time it's without forfeiting overclocking ability.  Energy efficiency is great, but you can't sell several hundred dollars of silicon on a 60% efficiency increase when the net savings would require the system to run for years before breaking even.

5) Intel including an iGPU is unfair
Simple response: buy something else.  I'm unaware of Intel possessing a monopoly.  You can buy a CPU from AMD, or perhaps a small fleet of ARM-powered devices.  Want performance?  Then buy Intel.  It's crap to say, but it's reality.  If I want a fast car, I pay an insane amount for a Veyron.  If I want a pickup truck, I buy a Toyota.  I can't complain to Toyota that they don't make a budget supercar.  What you're asking is for Toyota to suddenly start making supercars when their pickup business represents 90%+ of the market and prints money.  While an automotive analogy isn't perfect, it does highlight the absurdity of catering to a niche market, no?



I'm looking forward to how my words are misconstrued as Intel fanboyism.  What I appreciate is performance, and AMD can't deliver it.  If you pay the Nvidia tax, you've acquiesced to this point.  Most importantly, reality seems to be against the counterargument: look at a Steam hardware survey, and most people use an Intel CPU and an Nvidia GPU.  It seems the market has spoken.  While wishing for the glory days of the Athlon XP is reasonable, you have to deal with reality.  Right now, Intel could ship a 0% performance increase with Skylake, focusing only on the iGPU, and still make money.  Either understand that, or continue to argue that you are somehow special and deserving of a unique CPU.  The former is reality; the latter is fantasy bordering on narcissism.


----------



## Yorgos (May 13, 2015)

RejZoR said:


> Intel had GPUs for ages, but they were absolute garbage until AMD forced them to do something. They are still rather rubbish, but at least they improved them significantly.


The iGPU in the i7-4700QM I have in my hands is only a little better than the one on my old Foxconn motherboard with my Pentium D. I say a little better because benchmarks don't show the stability of the driver: even in Dota 2 I get an error at the start of every game, which means I have to restart the game after matchmaking. When I switch programs/games from the iGPU to the dGPU (750M), everything works as intended.
OTOH, the integrated GPU in my AMD 7850K behaves like a dGPU, which means I get full support for all the goodies a dGPU has, unlike the Intel iGPU, which supports only some OpenGL features (I'm running Linux) and makes many programs crash or misbehave.
The other APU-based laptop I have was dirt cheap with an underpowered 15 W APU, but every program that runs on OpenGL behaves flawlessly on it.
It doesn't matter how big the next iGPU is, or how many frames it pushes; the fact is that Intel doesn't care enough to make their iGPUs full-featured.
I laugh at people owning Macraps that run on Intel without a dGPU; they're stuck with a shitty GPU forever.


----------



## axxo22 (May 13, 2015)

lilhasselhoffer said:


> In the span of less than a decade Intel has gone from buggy trash to competent for business use. That's a huge leap, considering AMD and Nvidia took much longer to get there. If you're in doubt of this argument, compare Duke Nukem 3D to Hitman: Blood Money: that's one decade of difference, and you've still got some janky, boxy figures. In comparison, Intel's iGPU went from the Sandy Bridge part (2011) to competent 1080p playback by 2014.



Two words: Transistor Density.


----------



## lilhasselhoffer (May 13, 2015)

axxo22 said:


> Two words: Transistor Density.



Another two words: fact check.

Transistor count, AMD K5 (1996): 4,300,000 - 251 mm^2
Transistor count, AMD K10 (2007): 463,000,000 - 283 mm^2

Transistor count, Core i7 (2011): 1,160,000,000 - 216 mm^2 (total count)
Transistor count, Core i7 (2014): 1,400,000,000 - 177 mm^2 (total count)

Assuming that AMD's CPU transistor counts mirror those of a GPU (it's a stretch, but it makes things easier), a roughly 100-fold increase leads to a respectable increase in graphical fidelity.

Let's assume the share of Intel's transistors dedicated to the iGPU grew 20%, from 20% of the die in 2011 to 24% in 2014: 336,000,000 - 232,000,000 = 104,000,000, which is roughly a 9% increase relative to the 2011 total count.


You're telling me that a 10,000% increase in transistor count is comparable to a 9% increase.  Seriously?  Transistor density is important, but this is just silly.  Even if you add in architectural improvements, transistor count isn't some magic stick to wave around and claim means everything.
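For anyone who wants to check the arithmetic, here is the same back-of-envelope calculation in a few lines of Python. The die totals are the figures quoted above; the 20% and 24% iGPU shares are this post's assumptions, not measured numbers.

```python
# Back-of-envelope check of the transistor-count argument above.
# Die totals are the figures quoted in the post; the 20% -> 24%
# iGPU share is an assumption, not a measured number.
K5_1996 = 4_300_000                      # AMD K5
K10_2007 = 463_000_000                   # AMD K10
SNB_2011_TOTAL = 1_160_000_000           # Core i7, Sandy Bridge (total)
HSW_2014_TOTAL = 1_400_000_000           # Core i7, Haswell (total)

# AMD's CPU growth over a decade, used as a stand-in for GPU growth
amd_growth = K10_2007 / K5_1996          # ~108x, i.e. >10,000%

# Intel iGPU: assume 20% of the die in 2011, 24% in 2014
igpu_2011 = 0.20 * SNB_2011_TOTAL        # ~232,000,000
igpu_2014 = 0.24 * HSW_2014_TOTAL        # ~336,000,000
delta = igpu_2014 - igpu_2011            # ~104,000,000
delta_vs_total = delta / SNB_2011_TOTAL  # ~9% of the 2011 total

print(f"AMD growth: {amd_growth:.0f}x")
print(f"iGPU delta: {delta:,.0f} transistors "
      f"({delta_vs_total:.0%} of the 2011 total)")
```

The comparison the post draws is the ~108x (over 10,000%) AMD jump against a ~9% iGPU budget increase on Intel's side.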


Sources are always good:
http://www.notebookcheck.net/Intel-HD-Graphics-Sandy-Bridge.56667.0.html
-
http://www.anandtech.com/show/7003/the-haswell-review-intel-core-i74770k-i54560k-tested/5
-
http://www.wagnercg.com/Portals/0/FunStuff/AHistoryofMicroprocessorTransistorCount.pdf


----------



## axxo22 (May 14, 2015)

lilhasselhoffer said:


> Another two words: fact check.
> 
> Transistor count AMD K5 (1996): 4,300,000 - 251 mm^2
> Transistor count AMD K10 (2007): 463,000,000 - 283 mm^2
> ...



It's not the _increase_ in transistor count that matters. It's the transistor count.

Imagine what AMD could have achieved with that density in 2007.


----------



## techy1 (May 14, 2015)

My only reason for being pissed is this: when Intel releases a new CPU, all the tech sites and forums praise the iGPU gains. Why? (I'm not arguing there are no iGPU gains; there's always +30%.) Do you really need that product at all? You are tech people; why give Intel publicity and encouragement for a product you will never use ("never" meaning in the next 5-10 years)? If not, then shut up and "boo" Intel for tiny CPU gains, because it has been two years since the i7-4770K and I bet the new i7-6700K will be like a +10% increase in the CPU department. Is that good for you? Is that something Intel should be praised for? Or tell me: will you upgrade your CPU and platform for those iGPU gains only?


----------



## haswrong (May 14, 2015)

techy1 said:


> will you upgrade your CPU and platform for those iGPU gains only?



Don't give them money until they come up with an acceptable solution... you are the customer; you drive them.


----------



## THU31 (May 15, 2015)

My God, this is so pathetic. Seriously, how many unlocked-CPU users need the iGPU? I am so sick of Intel's bullshit decisions. The iGPU takes up more than half the die, and most people have no use for it.


----------



## Frick (May 15, 2015)

Harry Lloyd said:


> My God, this is so pathetic. Seriously, how many unlocked CPU users need the iGPU? I am so sick of Intel's bullshit decisions. the iGPU takes up more than half the die, and most people have no use for it.



They would have to make a vastly different chip only to use it in two CPUs, which seems wasteful.


----------



## THU31 (May 15, 2015)

No, they just prefer to sell big chips without iGPUs for $500-1000.

Lynnfield was 296 mm2 and it used to cost $196. Ivy Bridge-E was 257 mm2 and it used to cost $583. IVB-E should have been the standard desktop CPU for S1155, but that would not have been right for Intel.

Broadwell and Skylake would be microscopic without the giant iGPUs (much smaller than 100 mm2), and they could cost less than $100. But instead they snap on that useless iGPU and make everyone pay for it, even if they have no use for it whatsoever.


----------



## lilhasselhoffer (May 15, 2015)

axxo22 said:


> It's not the _increase_ in transistor count that matters. It's the transistor count.
> 
> Imagine what AMD could have achieved with that density in 2007.



I'll quote a famous movie here.  "I don't think that word means what you think it means."

If you're arguing that transistor density, as a proxy for transistor count (assuming planar transistors), is the reason Intel has improved their graphics, you are demonstrably incorrect.  I showed that a mere ~9% increase in raw transistor count took the graphics from meh fidelity to very decent performance.  This change occurred in less than 5 years, so no substantial shift in software could account for the increased performance.

If instead your argument is that simply throwing more transistors at the problem is the solution, that is also demonstrably incorrect.  A ~10,000% increase in transistor count took us from 2.5-dimensional characters to rough 3-dimensional characters, and it required a decade, patents limiting competition, and fundamentally new software.



So where is your argument?  I cannot see any salient point, unless it's that Intel has discovered a fundamentally different way to build transistors in the last 5 years that lets them work much faster.

As for AMD having that transistor count in 2007: who gives a crap?  That transistor count would have cost AMD an insane fortune.  Having only recently acquired ATI, where was that money going to come from?  AMD made the only fiscally responsible choice and put out what they could afford.  It wasn't the fastest chip; it was a budget performer.  Phenom competed well enough with Core 2, if only in the budget arena.  None of this even begins to address our current situation.





Harry Lloyd said:


> My God, this is so pathetic. Seriously, how many unlocked CPU users need the iGPU? I am so sick of Intel's bullshit decisions. the iGPU takes up more than half the die, and most people have no use for it.



"Most people" is a useless term.  You don't provide any view of what this elusive majority is.  You don't even cite a basic knowledge of the market.  Let's rectify that, for your sake, and those who espouse the same ideology.
http://www.macrumors.com/2014/11/07/apple-mac-us-pc-record/
-This article cites an IDC study, in which the top two PC creators are Lenovo and HP.
http://www.statista.com/statistics/233818/share-of-product-sales-in-total-sales-of-lenovo/
-This article shows that most of Lenovo's products are business oriented.  In decline, year after year, is the Desktop PC.  The desktop is where a GPU added would not significantly impact lifespan of the product, as they would be plugged in constantly.


It therefore is reasonable to conclude the following:
1) Most PC sales are driven by business users.
2) Most PC sales are of goods where increased battery life would be a big benefit.
3) iGPUs can save a substantial amount of energy, by removing a discrete GPU.
4) Intel's largest market is the one they cater to, and it is the market of business users.


It is understandable that a power user on a desktop would not want an iGPU.  It is not reasonable for Intel to design a radically different piece of silicon for markets without enough sales to justify the expenditure.  Nor is it reasonable for business users to lug 10 pounds of battery around for a 3-hour battery life just to look at a bunch of spreadsheets (business usage, in a nutshell).  Intel caters to its largest consumer base, they include an iGPU, and you have the chutzpah to call their decisions "bullshit."  Perhaps a moment more of introspection, and a bit less focus on what you believe you are entitled to, is in order.  You may not like the suggestion, but people rarely like being called on their entitlement.


----------



## peche (May 15, 2015)

techy1 said:


> I wish Intel would stop doing iGPUs... cuz I do not need them. I will always choose a mid-to-top-range discrete GPU (and even low-end GPUs are far more capable than an iGPU), so that iGPU is a waste of my money, time and TDP. Intel is like: "Oh look, a brand new iGPU, +50% over previous gen, almost 4K capable - we did this all for you, buddy; no matter that you do not and will not use the iGPU, our team worked 2 years (and in that time made only a 0-5% CPU gain) to give you this experience you will never use or need - lucky you, now pay extra for our useful work." I would choose, for that extra $ and extra TDP, a +1% CPU gain over a +50% iGPU... but there is no such option (never was and never will be - and that sux).


Agreed... I haven't used my integrated graphics. I wish Intel made i7s without integrated graphics for desktop computers, or made both integrated and non-integrated versions.


Prima.Vera said:


> Why the fk I need a strong GPU on an i7-K processor is beyond my comprehension! Usually people buying i7s are buying for gaming and multimedia. I only need a very basic GPU and that's it. HUGE WASTE of transistors, and therefore also a big-arse TDP. Seriously, sometimes I think those managers at Intel are worse than monkeys.


shared idea!!



Aquinus said:


> ...because it's a *mainstream *platform, skt1156/1155/1150 are all mainstream platforms with a full lineup of CPUs from entry to performance. Cool your jets and calm down. Maybe you should go work for them and design a new CPU if you shit don't stink. If you're really that bent out of shape about the platform then get a HEDT platform and stop complaining.


That's why I think they should offer the option of an i7 XX70K, XX90K or whatever "K" SKU without integrated graphics... just for gaming. Also i5s, because of their mainstream use in gaming.



TheinsanegamerN said:


> Go buy a six core i7 and stop complaining. Intel already made what you want. yeah, it cost way more, but guess what? a tiny market demands high prices. socket 1366 and 2011 exist for people like you.


False... I don't need a hexacore, and neither do most gamers. It would be nice to get the same mainstream processor without the integrated graphics; that would lower the TDP and also the temps.

I also recognize that mobile Core processors take full advantage of integrated video in several cases.


Regards,


----------



## johnspack (May 17, 2015)

Ridiculous.  I don't need to pay for an iGPU.  I need threads to run virtual machines, and I don't need a supercomputer to do that.  On my budget I'll have to go for a first-gen hex-core Xeon and a Sabertooth or a Rampage III with USB 3 and SATA 3.  Still $500+ US used.  Also ridiculous...


----------



## Aquinus (May 17, 2015)

peche said:


> That's why I think they should offer the option of an i7 XX70K, XX90K or whatever "K" SKU without integrated graphics... just for gaming. Also i5s, because of their mainstream use in gaming.


That still doesn't change the fact that the gaming community is small and very much a niche market. They have nothing to gain by giving gamers a different CPU when gamers will buy the one that's already on the market. It's cost and simplicity: when everything is the same, everything tends to work well together. iGPU or not, it's still functionality you wouldn't otherwise have, and when you're not using it, it gets power-gated, so it doesn't even contribute to the CPU's TDP when idle. Then you run into the argument this fellow brings up.


Harry Lloyd said:


> My God, this is so pathetic. Seriously, how many unlocked CPU users need the iGPU? I am so sick of Intel's bullshit decisions. the iGPU takes up more than half the die, and most people have no use for it.


Want more cores with that extra die space? Go HEDT. The fact is that most consumers do use the iGPU (I definitely do on my laptop for work), and Intel isn't going to make their lives harder because people who are being cheap and investing in a mainstream platform are nitpicky about features typically reserved for HEDT platforms (more cores, no iGPU, extra features). The simple fact is that 1150 as a platform doesn't change when you go from a Pentium to a K-edition i7. It's still the same platform, so you should expect the same kinds of features. It's like asking for a 6- or 8-core CPU without an iGPU on FM2+ when you might as well just go AM3+.

Let's be realistic here: people are only complaining because their money didn't buy them as much CPU as they wished, which is a legitimate complaint given AMD's lack of real competition, but it doesn't change the fact that this is a mainstream platform, and both AMD and Intel top out at 4 cores on their mainstream platforms. In reality, most people will barely use 2 cores, forget 4, except those of us who work and play on PCs.






The market doesn't change because you wish it would. The reality is that there is no marketing motivation to do what anyone here suggests, so it's unrealistic to expect Intel to change its ways unless there's a monetary reason to back it up.





johnspack said:


> Ridiculous.  I don't need to pay for an iGPU.  I need threads to run virtual machines.  I also don't need a supercomputer to do that.  On my budget I'll have to go for a first gen hex xeon and a
> Sabertooth or a Rampage III with usb3 and sata3.   Still 500+ US used.  Also ridiculous.....


I'm sorry if this comes off as rude, but the market doesn't care what your needs are; it cares whether you have money to pay for the product. Intel would rather have a big business invest in 100 CPUs than have you invest in one or two. For such a budget, AMD gets you more: at least AM3+ can run ECC memory and do PCI-E pass-through without breaking the bank.


----------



## OneMoar (May 18, 2015)

If Intel ever decided to get serious about graphics, neither Nvidia nor AMD would stand a chance.

It's not a matter of not knowing how; Intel simply didn't care about the graphics segment. But they have hit a wall making faster x86 cores, so now they are starting to focus on graphics.
AMD and Nvidia had better watch their backs: in just three generations, Intel now has a chip built from scratch that is capable of running with dedicated entry-level cards.
Y'all care to wager what they could do if they really wanted it?


----------



## lilhasselhoffer (May 18, 2015)

OneMoar said:


> If Intel ever decided to get serious about graphics, neither Nvidia nor AMD would stand a chance.
> 
> It's not a matter of not knowing how; Intel simply didn't care about the graphics segment. But they have hit a wall making faster x86 cores, so now they are starting to focus on graphics.
> AMD and Nvidia had better watch their backs: in just three generations, Intel now has a chip built from scratch that is capable of running with dedicated entry-level cards.
> Y'all care to wager what they could do if they really wanted it?



Short answer: Nothing. 

Long answer: Intel has one of two avenues to attack the graphics market.  The first is to either violate patent law or license the patents both Nvidia and AMD hold.  The second is to plow enough money into the market to fundamentally rework how the graphics business operates.

In the case of licensing, you hit problems rather quickly.  Nvidia and AMD both have reasons not to want Intel competing in their market.  Nvidia may work with Intel, but only because Intel has been neutered when it comes to competing in the GPU market.  AMD is competing with Intel in the low-end CPU market, and they've already seen Intel attack the APU business.  Neither company wants Intel to release an iGPU that competes with $150-200 discrete cards by removing the overhead between CPU and interface bus; this would cut their GPU business to shreds.  Even without touching the high-end GPU business, Intel could force Nvidia and AMD to become niche GPU producers.  Neither company could afford for Intel to steal their mid-level GPU business.

As far as patent infringement goes, that would not end well.  There's very little reason to look at Intel and see anything but a lawsuit piñata.  They've been caught in unethical and illegal business practices.  They've been investigated as an illegal monopoly.  They've been sued by enough patent trolls to make having an original thought at Intel akin to creating weaponized influenza (nobody wants to do it, given the horrible implications).  Intel's legal team would kibosh the idea of patent infringement so fast that the person who suggested it wouldn't remember how the pink slip was issued.


In the case of the latter, Intel has almost nothing to gain.  Let's say, hypothetically, that Intel does this.  They invest ten years of research (the existing 5 focused on the iGPU, plus another 5 to develop competing patentable technologies) and untold billions of dollars, and they come out with a GPU that competes with mid-tier discrete cards.  Who buys it?  Businesses see an expense close to a CPU plus a discrete GPU and ask why not just get a discrete GPU they can upgrade.  Performance users (video editing, gaming, etc.) buy a higher-tier video card to begin with, making the iGPU a waste of transistors.  The only way to recoup all that cost would be to release their own line of video cards.  Honestly, why?  Intel would be entering a mature market with no discernible advantages, competing in a market already saturated by AMD and Nvidia.  To get payback on the investment they'd have to not only steal market share but keep releasing new products.  Imagine a lawyer being advised that Intel was potentially building another monopoly in the GPU business.  Intel would be insane to do this; the risk so far outweighs the reward that nobody in their right mind would say yes.




TL;DR
Intel isn't trying to compete with Nvidia and AMD in the GPU market.  They're trying to make business users happy by shipping an integrated GPU capable of basic tasks.  The point of a discrete GPU isn't basic tasks, so conflating Intel's goals with dominating the GPU market is a bit silly.  While the iGPU has made amazing strides recently, it isn't changing the world; Intel's improved iGPU is making it easier for business users to have a device that doesn't break the bank but still covers their needs.




Edit:
Minor spelling changes.  Curse you English, and my sometimes tenuous grip over you.


----------



## Aquinus (May 18, 2015)

OneMoar said:


> if Intel ever decided to get serious about graphics neither nvida or AMD would stand a chance
> 
> its not a matter of not knowing how intel simply didn't care about the graphics segment but they have hit a wall with making faster x86 cores so now they are starting to focus on graphics
> AMD and Nvidia better watch there backs in just 3 generation's intel now have a chip built from scratch capable of running with dedicated entry level cards.
> yall care to wager what they could do if they really wanted it ?


Except for the part where Intel doesn't have the rights to use shader technology, which is why Intel's GPUs look a lot like chopped-down x86 cores. I think if Intel ever replaced how they do graphics, a lot of good would come of it.

With that said, Iris Pro is capable of pumping out about 832 GFLOPS of raw compute power, *very* close to the roughly 1 TFLOP each of my 6870s really does, which is pretty good for integrated graphics. It makes me wonder if an overclock on one of those new C-edition CPUs could get that to 6870-like performance.
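For what it's worth, the 832 GFLOPS figure is consistent with the usual peak-throughput formula. A minimal sketch, assuming the commonly cited Iris Pro 6200 configuration (48 EUs, 16 FP32 FLOPs per EU per clock from two SIMD-4 FMA units, ~1.08 GHz max dynamic frequency):

```python
def peak_gflops(execution_units, flops_per_eu_per_clock, clock_ghz):
    """Theoretical single-precision peak: units x FLOPs per clock x clock (GHz)."""
    return execution_units * flops_per_eu_per_clock * clock_ghz

# Assumed Iris Pro 6200 figures: 48 EUs, 16 FP32 FLOPs/EU/clock, ~1.083 GHz.
print(round(peak_gflops(48, 16, 1.083)))  # -> 832
```

The same formula explains why an iGPU clock bump moves the needle linearly: peak throughput scales directly with frequency when the EU count is fixed.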



lilhasselhoffer said:


> TL;DR
> Intel isn't trying to compete with Nvidia and AMD in the GPU market. They're trying to make business users happy by having an integrated GPU capable of basic tasks. The point of a discrete GPU isn't basic tasks, so conflating Intel's goals with dominating the GPU market is a bit silly. While the iGPU has made amazing strides recently, it isn't changing the world; Intel's improved iGPU makes it easier for business users to have a device that doesn't break the bank, but still does enough to cover their needs.


I'm glad that at least one person understands what's going on here.


----------



## OneMoar (Jun 7, 2015)

bahahhaha, the HD 6200 walking all over AMD's APUs
muahahahahhahaha
so if Intel were to make a graphics core a mere 20% faster than the 6200 (which they could do very, very easily), they would basically render AMD's AND Nvidia's entire entry-level dedicated card/iGPU/APU solutions worthless
so again, I'll restate it: if Intel deemed it so, they are perfectly capable of putting AMD out of that market segment
and if they wanted to go in for the kill, they could release a quad-core CPU with 4670K-level performance for under $150, and that right there would be the end of AMD


----------



## Aquinus (Jun 7, 2015)

OneMoar said:


> bahahhaha, the HD 6200 walking all over AMD's APUs
> muahahahahhahaha
> so if Intel were to make a graphics core a mere 20% faster than the 6200 (which they could do very, very easily), they would basically render AMD's AND Nvidia's entire entry-level dedicated card/iGPU/APU solutions worthless
> so again, I'll restate it: if Intel deemed it so, they are perfectly capable of putting AMD out of that market segment
> and if they wanted to go in for the kill, they could release a quad-core CPU with 4670K-level performance for under $150, and that right there would be the end of AMD


We haven't even seen how the iGPU on the new C-edition CPUs performs compared to what we have now, and there are a couple of things that may change the game. Take the eDRAM cache: can it be overclocked? DDR4: how will it affect iGPU performance compared to DDR3? And the iGPU cores themselves: how do they overclock? Put all of that together and there's a big unknown in how it will perform.

I can't wait to see a review.


----------



## mcraygsx (Nov 2, 2015)

techy1 said:


> My only reason why I am pissed is this: when Intel releases a new CPU, all tech sites and forums praise their iGPU gains - why? (I did not argue that there are no iGPU gains - there are always +30% gains.) Do you really need that product at all? I mean, you are tech people - why do you give Intel publicity and encouragement for a product you will never use ("never" - in the next 5-10 year time span)? If not, then shut up and "booo" Intel for tiny CPU gains, because it has been 2 years since the i7-4770K - I bet the new i7-6700K will be like a +10% increase in the CPU department - is that good for you, is that something Intel should be praised for? Or tell me - will you upgrade your CPU and platform for those iGPU gains only?




Well said Sir. Hats off to you.


----------



## peche (Nov 2, 2015)

OneMoar said:


> if Intel ever decided to get serious about graphics, neither Nvidia nor AMD would stand a chance


well....


----------



## R-T-B (Nov 4, 2015)

Everyone moaning about wanting an intel CPU without graphics should look to Haswell-E.


----------



## peche (Nov 4, 2015)

R-T-B said:


> Everyone moaning about wanting an intel CPU without graphics should look to Haswell-E.


not the same... it's pretty expensive...


----------



## mcraygsx (Nov 4, 2015)

peche said:


> not the same... it's pretty expensive...



Expensive, and it's not as energy efficient. But the iGPU on the 6700K is a complete waste for most.


----------



## Aquinus (Nov 4, 2015)

mcraygsx said:


> Expensive, and it's not as energy efficient. But the iGPU on the 6700K is a complete waste for most.


When fully loaded, sure. The thing is that even my 3820 doesn't consume much power at idle unless I'm overclocking it in a way that essentially disables power saving (bclk straps seem to override SpeedStep on my machine). Also, it's not that the CPUs are expensive. I got my 3820 brand new, when SB-E was the latest and greatest, for 300 USD, which was cheaper than the 2600k and 2700k. The problem was that skt2011 motherboards were significantly more expensive, so since I was already spending considerably more on the motherboard, I went all out and got the P9X79 Deluxe after talking to @cadaveca about it several years ago.

Either way, when my machine is completely at stock it pulls 150 watts from the wall, which is pretty low if you consider the poor efficiency of a small draw on a big power supply and the fact that I have 5 HDDs, 2 SSDs, and 7 case fans being driven. So if you put it all together, even my lowly 3820 in reality sips power when it's doing nothing... but at full load, with a CPU that has twice as many memory channels, 2MB more L3 cache, and 24 more PCI-E lanes than its 1155 counterpart, you're damn straight it's going to use a little more power. The point is that it's a lot less than you think, and the dynamic changes *when you actually use it*.

tl;dr: skt2011 and 2011-3 are more power efficient than you think, particularly when the thing is idling... but yeah, twice as many PCI-E lanes and memory channels and 2MB more L3 cache (on the quads) are going to draw more power; it's more hardware to drive, unlike the idle iGPU on mainstream CPUs.
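As a rough sanity check on the wall figure: AC draw converts to DC component power through PSU efficiency. A minimal sketch; the 80% efficiency at light load is an illustrative assumption, not a measurement of that particular power supply:

```python
def dc_power_watts(wall_watts, psu_efficiency):
    """DC power reaching the components, given AC draw measured at the wall."""
    return wall_watts * psu_efficiency

# A large PSU loaded at a small fraction of its rating often sits well
# below its peak efficiency; 0.80 here is an assumed value.
print(dc_power_watts(150, 0.80))  # -> 120.0
```

In other words, of an idle 150 W wall reading, only ~120 W would actually be feeding the CPU, drives, and fans under that assumption; the rest is conversion loss.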


----------



## EarthDog (Nov 4, 2015)

Run cooler?

Let me ask this: what has a higher temperature... a lighter with a yellow flame, or a bonfire with yellow flames?


(Answer: they are both the same temperature, but the bonfire clearly has more energy!)
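The analogy works because temperature is an intensive property while thermal energy scales with mass (Q = m·c·ΔT). A toy sketch with illustrative numbers (the masses and the ~450 J/(kg·K) specific heat are assumptions for demonstration only):

```python
def thermal_energy_joules(mass_kg, specific_heat_j_per_kg_k, delta_t_k):
    """Q = m * c * dT: same temperature rise, more mass means more energy."""
    return mass_kg * specific_heat_j_per_kg_k * delta_t_k

# Two steel blocks heated by the same 50 K: identical temperature,
# but the heavier one holds 100x the energy.
small = thermal_energy_joules(1, 450, 50)    # 1 kg block
large = thermal_energy_joules(100, 450, 50)  # 100 kg block
print(large / small)  # -> 100.0
```

The same distinction applies to CPUs: a higher-TDP chip moves more heat, but its die temperature depends on the cooler, not on the TDP alone.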

Oh lilhasselhoffer... LOL!


----------

