Tuesday, May 12th 2015
95W TDP of "Skylake" Chips Explained by Intel's Big Graphics Push
Intel's Core "Skylake" processor lineup, built on the company's swanky new 14 nanometer fab process, drew heads to its rather high 95W TDP for quad-core parts such as the Core i7-6700K and Core i5-6600K, even though their 22 nm predecessors, such as the i7-4770K and the i5-4670K run cooler, at 84W TDP. A new leaked slide explains the higher TDP. Apparently, Intel is going all-out with its integrated graphics implementation on Core "Skylake" chips, including onboard graphics that leverage eDRAM caches. The company is promising as much as 50% higher integrated graphics performance over "Haswell."
Although the chips have a high rated TDP, overall energy efficiency presents a different story. SoCs based on "Skylake" will draw as much as 60% lower power than "Haswell"-based ones, translating into 35% longer HD video playback on portable devices running these chips. Intel's graphics performance push is driven by an almost sudden surge in display resolutions, with standards such as 4K (3840 x 2160) entering the mainstream and 5K (5120 x 2880) entering the enthusiast segment. Intel's design goal is to supply the market with a graphics solution that makes the two resolutions functional for desktop use and video playback, if not gaming.
Source:
AnandTech Forums
Comments
So literally, thank you AMD for making this day happen. Now you just better deliver with Zen or you'll have no way to flaunt your APUs in front of Intel anymore.
Granted, the power of those GPUs wasn't that good up until the HD 2500, and with what I presume is R7 250-like performance on top Skylake CPUs, the simple assumption that "all APUs have better graphics horsepower than any Intel CPU" is going to change. A lot.
Competition is good, and when you've got AMD gloating about having discrete-level graphics on a CPU, it's like a little monkey poking a gorilla with a tiny poop stick. It might not notice right away, but eventually it's gonna turn around and swat it.
Before APUs, the last time I remember Intel being proud of a new GPU was back with the GMA 950, and that was a laughing stock when it launched. Since APUs, they've actually been putting forth effort. I was honestly impressed that the HD 4000 in my laptop could actually run some things. I was thinking it would be utter crap compared to the 660M in it, and well, it is, but it was better than expected.
They can't rival AMD or nVidia at the high end, but it's good to see them finally putting forth effort into graphics. Just imagine if Intel got real serious about graphics and decided to enter the GPU war. They've got the R&D, fabs, and all the brains to be able to do it.
A family of APUs and another of CPUs would make more sense.
Also, Iris Pro can drive 4K video on a 4K display. I know because I've done it and it works great. Is it good for gaming? Not really, but is it good for everything except gaming, or maybe even a bit of light gaming? Sure. Just remember, Intel makes most of its money off businesses, not individual consumers, so it only makes sense that their products reflect the market, and a huge chunk of the market uses iGPUs or has no use for discrete graphics.
Another way of putting it is: enthusiasts are a very small niche, and most "enthusiasts" don't even want to spend much money anyway, so you'd be stuck with a mainstream platform by virtue of your budget. So no, this is Intel making a one-size-fits-all. If you're not happy with it, you have options; it's called Haswell-E. I felt a similar way when I upgraded 3 years ago, hence why I have a 3820 and not a 2600K.
:laugh::toast::D
When did I say I don't like the features of Intel's integrated GPU? I said it "kinda feels like a waste when most PC enthusiasts will be using a dedicated GPU but then again I can see why too, Intel wanna compete & be above AMD on all levels." See! I said I understand why they're doing it. :slap: