# Intel Core i7-4770K 'Haswell' HD Graphics 4600 GPU Performance



## W1zzard (Jun 1, 2013)

Today, Intel released their new Haswell processors which include improvements to the integrated graphics core. We test 17 of the latest games to investigate whether the Intel HD Graphics 4600 is of any use for serious PC gaming.

*Show full review*


----------



## Protagonist (Jun 1, 2013)

Interesting, though when I get the i7-4770 non-K I will immediately plug my GTX 670 back in. Very nice review, W1zzard.

Though you might want to correct this:



> As we can see, the sweet spot looks to be at 1600 MHz, which is quite cheap nowadays. If you want to spend a bit more money, *1833* MHz seems reasonable, but anything beyond is just money spent on almost no improvement.



to 1866.


----------



## Supercrit (Jun 1, 2013)

I know that the review is specifically targeted at the GPU part, but the poor performance should be expected, no? I'd hardly call that a con. After all, you don't buy a 4770K for the graphics.

Glad that waste-of-resources graphics cards like the GT 520 and HD 6450 will go extinct.


----------



## vinibali (Jun 1, 2013)

How did you get the A10-6800K?


----------



## pjl321 (Jun 1, 2013)

Can you still use stuff like QuickSync if you have a discrete AMD or NVIDIA card installed?


----------



## Frick (Jun 1, 2013)

This bit here: 



> For the mobile segment, Intel has added a third tier of processor graphics based on a new GT3 graphics core called "Iris Pro" and "Iris", but these are only available on the BGA socket, so they can't be used with LGA1150 motherboards. I think the logic behind these is that traditional desktop users will use a real graphics card for serious gaming performance, but integrated GT2-based graphics—HD Graphics 4600, 4400, and 4200—should suffice for normal desktop work, videos, and light-gaming usage.



Is actually pretty sad, because those things are faster than Trinity. In some cases they're on par with the GeForce GT 640.


----------



## pjl321 (Jun 1, 2013)

> Just like on Sandy Bridge and Ivy Bridge, Intel has integrated a quad-core CPU



Yeah, and the same as a million other chips since way back in 2006 with the quad-core Kentsfield QX6700. OK, so that one wasn't fully integrated, but my point is it's been 7 years since the first Intel x86 quad-core; it's time to step it up now. 8-core should have been the mid-range default by now. We are not really pushing frequencies up like we used to back in the old days, and we seem to have stopped increasing core count too, and Intel wonders why people aren't buying as much and why its profits are down?!

Give us something to be excited about and we'll put our hand in our pockets!

The QX6700 was made on 65 nm; we have had 45 nm, 32 nm, and now 22 nm. The QX6700 was over twice the size of the 4770K, even taking into account the massive on-board GPU. Couple this with the massive improvements in thermal management and we should have moved on in the last 7 years.

Give us octa-core already!


----------



## dude12564 (Jun 1, 2013)

pjl321 said:


> Can you still use stuff like QuickSync if you have a discrete AMD or NVIDIA card installed?



Maybe the motherboard vendors will include Lucid MVP.


----------



## Aquinus (Jun 1, 2013)

pjl321 said:


> Yeah, and the same as a million other chips since way back in 2006 with the quad-core Kentsfield QX6700. OK, so that one wasn't fully integrated, but my point is it's been 7 years since the first Intel x86 quad-core; it's time to step it up now. 8-core should have been the mid-range default by now. We are not really pushing frequencies up like we used to back in the old days, and we seem to have stopped increasing core count too, and Intel wonders why people aren't buying as much and why its profits are down?!
> 
> Give us something to be excited about and we'll put our hand in our pockets!
> 
> ...



...and how much software does the average user (or even gamer) use that can actually harness the power of an 8-core processor? They're not doing it because the only demand for it is from people like you, who know what they really need from a CPU, or people who really do need more cores. There aren't many situations where I would see a difference in performance going from my 3820 to a 3930K, and it's not every day that I'm encoding video or running something that can fully utilize 8 logical cores.

Intel is giving the general market exactly what it wants, to be completely honest. It's important to remember that our wants as TPU members don't always coincide with those of the general market.

I think Intel did fine, and I think your scrutiny of Intel and Haswell is unfounded.


----------



## Vulpesveritas (Jun 1, 2013)

Aquinus said:


> ...and how much software does the average user (or even gamer) use that can actually harness the power of an 8-core processor? They're not doing it because the only demand for it is from people like you, who know what they really need from a CPU, or people who really do need more cores. There aren't many situations where I would see a difference in performance going from my 3820 to a 3930K, and it's not every day that I'm encoding video or running something that can fully utilize 8 logical cores.
> 
> Intel is giving the general market exactly what it wants, to be completely honest. It's important to remember that our wants as TPU members don't always coincide with those of the general market.
> 
> I think Intel did fine, and I think your scrutiny of Intel and Haswell is unfounded.



Actually, an octa-core design might be a good idea overall for furthering performance on titles that are next-gen console ports, since those will likely be eight-threaded, written for processors that are much weaker per core but have eight of them; both high-end next-gen consoles run on octa-core x86 processors.


Also, I noticed in that Anand bit that the i7s with GT3 have twice the power consumption of an AMD A10 to gain a 25% gaming speed advantage over AMD. Go figure. Can't wait for Kaveri.


----------



## Bale11 (Jun 1, 2013)

I think we should not blame software developers for the lack of support for 4 cores and up, because, as we all know, Intel, as the leader and dominant player in the CPU market, makes deals with software developers to slow the wheel of innovation, at least in the desktop segment. We see that the majority of Windows applications are single-threaded and, in extreme cases, use 4 cores, and we know that multi-threaded apps are the future; this is where AMD CPUs shine. So as long as Intel is the leader, nothing will change.
And sorry for my bad English.


----------



## pjl321 (Jun 1, 2013)

Aquinus said:


> ...and how much software does the average user (or even gamer) use that can actually harness the power of an 8-core processor? They're not doing it because the only demand for it is from people like you, who know what they really need from a CPU, or people who really do need more cores. There aren't many situations where I would see a difference in performance going from my 3820 to a 3930K, and it's not every day that I'm encoding video or running something that can fully utilize 8 logical cores.
> 
> Intel is giving the general market exactly what it wants, to be completely honest. It's important to remember that our wants as TPU members don't always coincide with those of the general market.
> 
> I think Intel did fine, and I think your scrutiny of Intel and Haswell is unfounded.



I'm not dismissing Haswell; it looks like a good chip, and from what I've seen it will overclock like a female dog. But with regards to the core count, it is simply a chicken-and-egg scenario. People said dual core wasn't needed and Windows would never take advantage of it properly; then people said quad core was massive overkill and there was no software out there that the common man would use that would stress 4 cores.

I accept that most software out there isn't set up for octa-core, but as always: make the chips and developers will make the software to take advantage of them. Not to mention the PS4 and Xbox One are both octa-core, so games will be designed to use 8 cores from here on in.


----------



## Jstn7477 (Jun 1, 2013)

My 3770K already has 8 threads (just Hyper-Threading, though), yet none of the games I play use more than 4 threads in HWiNFO64, overlaid onto my games using RivaTuner Statistics Server. In fact, to get rid of my CPU limitation in many games, I would need four faster cores more than this 8-core nonsense. 6/8 cores are useful for streaming your live gameplay to TwitchTV, but other than that, they are currently useless.


----------



## W1zzard (Jun 1, 2013)

Bale11 said:


> We see that the majority of Windows applications are single-threaded



That's because the majority of Windows applications don't need any significant CPU performance. When was the last time you waited for a desktop application (and it was not waiting for disk or network access to complete)?
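W1zzard's distinction between being CPU-bound and waiting on disk or network can be sketched in a few lines of Python. This is a generic illustration, not anything from the review; the sizes and file handling are arbitrary choices for the demo:

```python
import os
import tempfile
import time

# CPU-bound work: summing 10 million integers keeps the CPU busy.
t0 = time.perf_counter()
total = sum(range(10_000_000))
cpu_time = time.perf_counter() - t0

# I/O-bound work: write and re-read ~50 MB through a temp file.
# Here the disk (or OS cache) is the bottleneck, not the CPU,
# so faster or additional cores would barely help.
t0 = time.perf_counter()
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"x" * 50_000_000)
    path = f.name
with open(path, "rb") as f:
    data = f.read()
os.remove(path)
io_time = time.perf_counter() - t0

print(f"CPU-bound: {cpu_time:.3f}s  I/O-bound: {io_time:.3f}s")
```

Typical desktop apps spend most of their wall-clock time in the second kind of wait, which is why an SSD often feels like a bigger upgrade than extra cores.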


----------



## erocker (Jun 1, 2013)

Supercrit said:


> After all, you don't buy a 4770K for the graphics



If the graphics were better, I'd be buying more than one. Really, it would be great if we could get the CPU power of Haswell with, say, the GPU from an AMD 6800K all in one. Really though, if the GPU in Haswell is good enough for high-def video, then it's good enough for me.


----------



## cheesy999 (Jun 1, 2013)

W1zzard said:


> That's because the majority of Windows applications don't need any significant CPU performance. When was the last time you waited for a desktop application (and it was not waiting for disk or network access to complete)?



This. Upgrading to a solid-state drive made a bigger difference to my computer than anything else has in a long time, even though my solid-state drive is an older, cheap one.

HD 3000 graphics will actually run most modern games on low, so now that they've improved the performance even more, this may well give you the ability to play most games on a laptop with several hours of runtime.

In short: the HD 4600 will be useful because now you can play full PC games on the train, etc.


----------



## Jstn7477 (Jun 1, 2013)

At least the graphics are there in case your video card goes up in smoke, like my spare GTX 460 did while my first HD 7950 was out for RMA.


----------



## JDG1980 (Jun 1, 2013)

I'd be interested to know if they finally got 23.976 fps refresh rates right this time. That was one of the biggest roadblocks to using an Intel chip for an HTPC in previous generations.


----------



## Hayder_Master (Jun 1, 2013)

WTF? W1zzard does a GPU test before other websites do a CPU test. LOL, awesome.


----------



## VulkanBros (Jun 1, 2013)

cheesy999 said:


> This. Upgrading to a solid-state drive made a bigger difference to my computer than anything else has in a long time, even though my solid-state drive is an older, cheap one.



+1 for that


----------



## XtremeCuztoms (Jun 1, 2013)

Very Nice W1zzard !!


----------



## Brusfantomet (Jun 1, 2013)

This gives a precursor to how big the performance change in the x86 cores is.

But it's nice to see Intel taking AMD seriously (albeit in the GPU department).

But as others have noted already, wait a year and 8 cores will come, as both of the major consoles (PS and Xbox; Nintendo does not play the specs game) have eight cores (albeit Jaguar, a.k.a. low-power, cores). Well, half of TPU already has 8+ threads (Intel i7s and AMD Bulldozers), so the only question I am left with is: is it worth upgrading from my i7 920? That could easily be answered with an i7 920 in the comparison pool when the review of the CPU performance of the 4770K is done. Also, I hope I am not the only one who wants to know whether Haswell CPUs have TIM or solder between the silicon and the heat spreader.

And as always, a pretty good review here at TPU. Looking forward to the CPU part.


----------



## Sasqui (Jun 1, 2013)

I love TPU. I'm glad I have a nicely overclocked 3570K CPU and 7870 GPU; works for me.


----------



## Aquinus (Jun 1, 2013)

Sasqui said:


> I love TPU. I'm glad I have a nicely overclocked 3570K CPU and 7870 GPU; works for me.



Hear, hear! My 3820 and 6870s work nicely for me as well. Just because it's not the fastest doesn't mean it won't perform just as well for the things you need it to do.


----------



## diopter (Jun 2, 2013)

It looks like you guys made a calculation error in your summarised performance analysis.

If the i7-4770K is set at 100% and the i7-3770K is set at 75%, then the i7-4770K is not 25% faster. It's 33% faster. (100/75 ≈ 1.333)
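diopter's arithmetic can be double-checked with a tiny speedup helper (generic code, not from the review):

```python
def speedup_percent(baseline_score: float, new_score: float) -> float:
    """Percentage by which new_score exceeds baseline_score."""
    return (new_score / baseline_score - 1.0) * 100.0

# 3770K normalized to 75, 4770K to 100: the 4770K is ~33% faster...
faster = round(speedup_percent(75, 100), 1)   # 33.3

# ...while the 25% figure is the reverse direction:
# the 3770K is 25% slower than the 4770K.
slower = round(speedup_percent(100, 75), 1)   # -25.0

print(faster, slower)
```

The asymmetry is just because the two comparisons use different baselines.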


----------



## Delta6326 (Jun 2, 2013)

Wow, I must be getting negative scaling in Grid 2; I'm only getting about 32 fps... 2x 4850.

Performance is still lacking, but it will be very interesting to see BGA performance.


----------



## Recus (Jun 2, 2013)

pjl321 said:


> Yeah, and the same as a million other chips since way back in 2006 with the quad-core Kentsfield QX6700. OK, so that one wasn't fully integrated, but my point is it's been 7 years since the first Intel x86 quad-core; it's time to step it up now. 8-core should have been the mid-range default by now. We are not really pushing frequencies up like we used to back in the old days, and we seem to have stopped increasing core count too, and Intel wonders why people aren't buying as much and why its profits are down?!
> 
> Give us something to be excited about and we'll put our hand in our pockets!
> 
> ...


----------



## Thefumigator (Jun 2, 2013)

Haswell is an improvement, as simple as that. I like the power consumption improvement, and I still dislike its 3D performance, despite it being better than ever for Intel.

The simple fact that there are other products around, like Trinity, with better 3D performance in one chip spoils the whole product a bit. On the other hand, if you get Haswell for its x86 performance, which is great performance indeed, I don't really get why the GPU is integrated into the chip (apart from making QuickSync work). I mean, top-of-the-line FX processors lack a GPU because you don't want one there. So I still believe Intel should take that thing out, in my opinion.

Anyway, the Intel beast gets closer on the GPU side, closer every day; someday it may catch up.


----------



## lordz (Jun 2, 2013)

Why are the 5800K and 6800K being run on 1600 MHz RAM when testing their IGP?
It's well known that these boards support much faster RAM, and the integrated graphics on these chips perform drastically better at 1866 or 2133 MHz.

If you were building any of these systems for their IGP value, you would run the AMDs on the correct-speed RAM, and if this had been done, there would be another 10 fps on the AMD chips, making them far superior to Intel's HD 4600.


----------



## lordz (Jun 2, 2013)

I should clear it up a little bit: I understand they used 1600 MHz RAM for both systems to be fair, but really they should have run them BOTH at 1866 or 2133, so that the IGP is actually showing what it can do.

It's kind of like comparing a Titan and a GTX 560 with both systems running a Celeron: it's going to hit a limitation before the GPUs hit their limits.


----------



## Thefumigator (Jun 3, 2013)

lordz said:


> I should clear it up a little bit: I understand they used 1600 MHz RAM for both systems to be fair, but really they should have run them BOTH at 1866 or 2133, so that the IGP is actually showing what it can do.
> 
> It's kind of like comparing a Titan and a GTX 560 with both systems running a Celeron: it's going to hit a limitation before the GPUs hit their limits.



I agree, but the Radeon on 1600 is still better than the Intel IGP. I can imagine that at 2133 the Radeon will improve its performance by 10 to 20%, and so should the Intel IGP, but that's just a hypothetical argument; it would be nice to compare IGPs by RAM speed and see what happens to both.


----------



## lordz (Jun 3, 2013)

Agreed, although seeing that the Intels are rated for 1600 MHz RAM, I don't know that there will be much improvement.
It appears AMD is still king at IGP.


----------



## Cheeseball (Jun 3, 2013)

JDG1980 said:


> I'd be interested to know if they finally got 23.976 fps refresh rates right this time. That was one of the biggest roadblocks to using an Intel chip for an HTPC in previous generations.



This might be of interest to you and me.

EDIT: LOL, I just saw that you commented on that article. Heh.


----------



## Casecutter (Jun 3, 2013)

This whole "gaming exercise" is just redeeming futility... sure, it's fun to know they can now match/beat the low-end card from 2 years ago. (I'll ask: was that 6450 sporting GDDR5 or DDR3?) It's just bizarre to explore the GPU performance of an unlocked (84 W), $340 CPU/IGP for its modern gaming merits! It amounts to Intel's benevolent offering of an IGP just so you can run without your discrete graphics card, for either initial setup or diagnosing without a card. Sure, it would make sense if this were an i3 box, to see whether your pre-teen might play something, but with these titles it is a lesson in futility. And who today would buy a new OEM box and not end up pairing it with a 1920x1080 monitor? Sure, you could set the resolution lower, but where's the fun in that?


----------



## Renald (Jun 3, 2013)

I see no point in this test...

People who use these chips to play don't turn on AA or play games with a lot of shaders. They play CS or WoW without high graphics. Surely the IGP will fail on hard requirements; no need to test it on 20 games.

The question for these IGPs is: "Can I play most current games at medium settings?" not "Am I going to reach 1235692 FPS in Tomb Raider at 5760x1080 with TressFX activated?"

Most of the tests here are useful and well driven, but this one is totally useless. It totally misses the point of the IGP.


----------



## REDHOTIRON2004 (Jun 4, 2013)

*APU and GPCPU is the future not a discrete CPU/GPU!*



Renald said:


> I see no point in this test...
> 
> People who use these chips to play don't turn on AA or play games with a lot of shaders. They play CS or WoW without high graphics. Surely the IGP will fail on hard requirements; no need to test it on 20 games.
> 
> ...



Looking at the future of computing, I will say that you are wrong about the IGPs or APUs from Intel and AMD respectively.
I believe that, with the miniaturization of the complete system, the GPU and CPU will be one component that handles both games and general-purpose requirements in a heterogeneous way.
So, for example, if you want to upgrade your CPU or GPU, you would just upgrade to an APU that delivers the same performance as a present-day discrete GPU.

Also, in the future, APUs are going to be CrossFired/SLIed to increase overall performance twofold or threefold.
Integrated graphics on CPUs are not strong at present, but with the introduction of heterogeneous computing (which is gaining momentum), separate GPUs and CPUs will become obsolete!
And the GPCPU/APU will be the only thing anyone looks out for!


----------



## pjl321 (Jun 4, 2013)

REDHOTIRON2004 said:


> Looking at the future of computing, I will say that you are wrong about the IGPs or APUs from Intel and AMD respectively.
> I believe that, with the miniaturization of the complete system, the GPU and CPU will be one component that handles both games and general-purpose requirements in a heterogeneous way.
> So, for example, if you want to upgrade your CPU or GPU, you would just upgrade to an APU that delivers the same performance as a present-day discrete GPU.
> 
> ...



I agree up to a point. You only have to look at some of the HandBrake or Adobe numbers that are optimised for OpenCL to see that the future of high-performance computing is heterogeneous, but they will never put a heavily overclocked 8-core x86 (200 W+) plus a state-of-the-art GPU (300 W+) onto one tiny little die. Ivy Bridge and Haswell have shown us that as we shrink transistors we create more heat problems than we solve, and that is at a poxy 84 W! OK, so the problems don't start at 84 W, but you can't overclock any 22 nm chip very much without massive heat density issues. This will only get worse as we shrink further.


----------



## REDHOTIRON2004 (Jun 5, 2013)

pjl321 said:


> I agree up to a point. You only have to look at some of the HandBrake or Adobe numbers that are optimised for OpenCL to see that the future of high-performance computing is heterogeneous, but they will never put a heavily overclocked 8-core x86 (200 W+) plus a state-of-the-art GPU (300 W+) onto one tiny little die. Ivy Bridge and Haswell have shown us that as we shrink transistors we create more heat problems than we solve, and that is at a poxy 84 W! OK, so the problems don't start at 84 W, but you can't overclock any 22 nm chip very much without massive heat density issues. This will only get worse as we shrink further.



Just some days back, I was going through some articles that mentioned the kind of miniaturization being targeted by companies like Intel, IBM, etc. You might be surprised to know that Intel already has prototypes for a sub-10 nm process, and we will start to see 14 nm within 6 months from now, and 10 nm (using new materials like graphene and silicates, capable of running at several terahertz instead of the gigahertz we have now) in 2014. They are targeting taking it down to 2 nm. And according to some findings/calculations, even if these companies reach a 1 nm process, there would still be a huge opportunity for reduction, as atoms are still much smaller than 1 nm (probably quantum computing would kick in).
At 2 nm, the number of transistors that can be accommodated on these chips would theoretically increase performance by 300 times compared to the 32 nm process.
So, a powerful GPCPU can easily be made at that level of miniaturization, and more power can be gained through SLI/CrossFiring those GPCPUs.
The heat issue would be tackled by using more efficient materials, such as the ones I mentioned above, at lower manufacturing processes. And of course we will see more innovations during these years, which will improve architectures/efficiencies even more.

In my view, the GPU concept will vanish in a couple of years from now, as the new breed of much more powerful APUs (with huge numbers of cores) takes care of both CPU and GPU (gaming) tasks, with the capability for scaling in parallel (as a GPU) and sequentially (as a CPU)!


----------



## pjl321 (Jun 5, 2013)

REDHOTIRON2004 said:


> Just some days back, I was going through some articles that mentioned the kind of miniaturization being targeted by companies like Intel, IBM, etc. You might be surprised to know that Intel already has prototypes for a sub-10 nm process, and we will start to see 14 nm within 6 months from now, and 10 nm (using new materials like graphene and silicates, capable of running at several terahertz instead of the gigahertz we have now) in 2014. They are targeting taking it down to 2 nm. And according to some findings/calculations, even if these companies reach a 1 nm process, there would still be a huge opportunity for reduction, as atoms are still much smaller than 1 nm (probably quantum computing would kick in).
> At 2 nm, the number of transistors that can be accommodated on these chips would theoretically increase performance by 300 times compared to the 32 nm process.
> So, a powerful GPCPU can easily be made at that level of miniaturization, and more power can be gained through SLI/CrossFiring those GPCPUs.
> The heat issue would be tackled by using more efficient materials, such as the ones I mentioned above, at lower manufacturing processes. And of course we will see more innovations during these years, which will improve architectures/efficiencies even more.
> ...




I possibly didn't make my point clearly. I am not saying that in the future you won't fit the power of today's top CPUs and GPUs onto one small chip, as I know the pace at which tech evolves. What I am saying is that there will always be limiting factors to integration; most likely this will be heat, but soon we will see things like quantum tunneling becoming an issue.

But any new tech that comes out will be generally available to all. A CPU company will already push its chips to the max, but at the same time a GPU company will always push its chips to the max too. You can't expect to combine two things at their limits without something giving, i.e. making a compromise. If you have two separates, there is no compromise, therefore a much faster system, and therefore separates will never completely die.

P.S.

Don't get too excited about the next big thing, as it almost never actually comes, and when it does it's always a massive letdown. The Pentium 4 was meant to ship at speeds of over 10 GHz, Sandy Bridge was going to be an octa-core, and Hitachi is 3 years late on its pledged 5 TB HDD!

http://www.geek.com/chips/intel-predicts-10ghz-chips-by-2011-564808/

http://www.theregister.co.uk/2008/07/04/hitachi_5tb_hdd_2010/


----------



## George_o/c (Jun 11, 2013)

Such a pity they limited the GT3 and made it unavailable for desktops... Maybe it could have offered some more competition to AMD. Thanks a lot for the review!


----------

