# Core i5-3570K Graphics 67% Faster Than Core i5-2500K, 36% Slower Than GeForce GT 240



## btarunr (Feb 20, 2012)

An Expreview community member ran benchmarks comparing the Intel HD 4000 graphics embedded in the upcoming 22 nm "Ivy Bridge" Core i5-3570K against the integrated graphics of the Core i5-2500K and a discrete NVIDIA GeForce GT 240. The tests are endorsed by the site. The suite of benchmarks included games that aren't particularly taxing on graphics hardware by today's standards yet are extremely popular, such as StarCraft II, Left 4 Dead 2, DiRT 3, and Street Fighter IV. Some of the slightly more graphics-intensive benchmarks included Far Cry 2 and 3DMark Vantage. All benchmarks were run at 1280 x 720 resolution. 

The Intel HD 4000 graphics core beats the HD 3000 hands down, with performance leads as high as 122% in one test. The chip produces more than playable frame rates in Left 4 Dead 2 and Street Fighter IV, both well above 50 FPS, while even DiRT 3 and Far Cry 2 run acceptably at over 30 FPS. StarCraft II is where it dips under 30 FPS, so the chip might get bogged down in intense battles; there, a mainstream discrete GeForce or Radeon is a must. On average, the graphics core embedded in the Core i5-3570K was found to be 67.25% faster than the one in the Core i5-2500K. 

When pitted against the 2+ year old GeForce GT 240, the Core i5-3570K struggles: in StarCraft II it's 53.64% slower, and on average the GT 240 emerged 56.25% faster. Still, it's a decent effort by Intel at the entry level. We are hearing good things about the HD video playback and GPU-acceleration capabilities of Intel's HD 4000 core, so there's still something to look out for. Granted, comparing the i5-3570K to the i5-2500K isn't a 100% scientific comparison, since CPU performance also factors in, but it was done purely to assess how far Intel has come with its graphics.

*View at TechPowerUp Main Site*


----------



## claylomax (Feb 20, 2012)

Do you think it can play Hard Reset maxed out at 1900x1200?


----------



## btarunr (Feb 20, 2012)

claylomax said:


> Do you think it can play Hard Reset maxed out at 1900x1200?



I don't think so. Hard Reset at that resolution, maxed out, can be sufficiently taxing on even $200 graphics cards.


----------



## OneCool (Feb 20, 2012)

Why is it being compared to NVIDIA and not AMD's APU?

Doesn't make much sense to me.


----------



## Casecutter (Feb 20, 2012)

67.25% faster than that of a 2007 IGP is good?  Then to compare it to a 2009 GT 240, probably a DDR3 version, being within 36% is adequately respectable now in 2012.  So it's maybe like an HD 5550, which could now almost pass for modern entry level, so not bad.


----------



## ZoneDymo (Feb 20, 2012)

OneCool said:


> Why is it being compared to nvidia not AMDs apu?
> 
> Doesnt make much sense to me.




Maybe those comparisons hurt Intel's ego, I don't know.


----------



## THANATOS (Feb 20, 2012)

It looks very nice until you find out that the HD 3000 has a worse score than it should.
http://semiaccurate.com/forums/showpost.php?p=153673&postcount=23


----------



## THANATOS (Feb 20, 2012)

*btarunr* You made a mistake.


> On average, the GT 240 emerged 36% faster.


The HD 4000 is 36% slower than the GT 240, but looked at from the other direction: 100 − 36 = 64, and 100/64 ≈ 1.5625, so the GT 240 is 56.25% faster than the HD 4000.
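The asymmetry ("36% slower" one way, "56.25% faster" the other) comes from the change of baseline. A quick sketch of the conversion, with a hypothetical helper name:

```python
# "A is s% slower than B" and "B is f% faster than A" use different baselines,
# so the two percentages differ.
def faster_pct(slower_pct: float) -> float:
    """Given that A is slower_pct% slower than B, return how much faster B is than A."""
    a_relative = 100.0 - slower_pct        # A's performance as a percentage of B's
    return (100.0 / a_relative - 1.0) * 100.0

print(faster_pct(36.0))  # 56.25
```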


----------



## dickobrazzz (Feb 20, 2012)

wow..beautiful!


----------



## WarraWarra (Feb 20, 2012)

The IGP is okay for most office use, internet-café surfing, or headless-server use. It helps if you use webcl.nokia.com and Intel's WebCL software.
Still have no clue why Intel isn't in some scheme with a dedicated IGP company to supply its IGPs, versus Intel wasting time on IGPs itself.

The best part is that when you run an i7 CPU on its IGP (no dedicated AMD/NVIDIA GPU), the CPU performs much worse.

Example:
A mobile i7-2670 runs 7-Zip (8/8 threads) at an average of 12500, where an i7-2630 with a mobile NVIDIA GPU runs at 15244–15384 (8/8); same hardware except for the GPU, with the CPU swapped for testing. You would expect the opposite results.


----------



## faramir (Feb 20, 2012)

So how does the HD 4000 compare to A8-3850/3870 graphics then?

I understand Trinity figures aren't available yet. I can't wait for the rumored performance of Kaveri though; finally an integrated GPU decent enough to make an additional GPU unnecessary, if only they can get CPU IPC performance up from the 2006 first-generation Phenom level ...


----------



## DarkOCean (Feb 20, 2012)

And the big difference comes from optimization for 3DMark; in games alone it's only 56%, still OK for an Intel IGP.


----------



## Borc (Feb 20, 2012)

Casecutter said:


> 67.25% faster than that of a 2007 IGP is good?



2007? Are you joking? The HD 3000 launched last year, in 2011.


----------



## Yo_Wattup (Feb 20, 2012)

Still quite mediocre IMO.


----------



## HTC (Feb 20, 2012)

faramir said:


> *So how does HD4000 compare to A8-3850/3870 graphics then ?*
> 
> I understand Trinity figures aren't avaliable yet, I can't wait for the rumored performance of Kaveri though, finally a decent enough integrated GPU with absolutely no need for additional GPU, if only they can get the CPU IPC performance up from the 2006 first generation Phenom level ...



Since there shouldn't be any review of this available yet, are there any pitting an A8-38X0 against the GeForce GT 240?


----------



## Completely Bonkers (Feb 20, 2012)

Quite impressive. Now if you could JOIN that performance with a budget gaming card, you'd be good to go. What a shame that the company developing that concept left the market. What was it called again?

How about a Xeon dual-socket version of this chip? If it could combine graphics performance, now that would be decent enough for most people, and every reason for everyone to buy a workstation board and for Intel to sell twice as many CPU chips 

In fact, they could go back in time to the 386 and 387 math coprocessor concept. Only this time it would be GPU coprocessor. They could build a sister-chip that had half the CPU cores but double the GPU core/shaders, and it would make a marvellous combination.


----------



## Inceptor (Feb 20, 2012)

Borc said:


> 2007? Are you joking? HD3000 was launched in 2011 last year.



I think he means 2007 era discrete graphics performance.


----------



## HTC (Feb 20, 2012)

HTC said:


> Since there shouldn't be any review of this available, *are there any of A8-38X0 pitted against GeForce GT 240?*



Found a review with both an A8-3850 and a GT 240 here

It has mixed results: the GT 240's performance relative to the A8-3850 (with its RAM @ 1866) ranges between 91% and 120%.

Would prefer a more comprehensive review for this comparison but was unable to locate one. 

EDIT

With this, it seems that the Core i5-3570K's graphics still aren't up to those of an A8-3850. It's a whole new ball game when you factor in the CPU portion of the chip.


----------



## v12dock (Feb 20, 2012)

Who did they get the design from this time around?


----------



## Casecutter (Feb 20, 2012)

Borc said:


> HD4000 is 36% slower than GT240


Ah, which HD4000?  If a (Radeon HD) 4670, well yes, the GT 240 GDDR5 that came along 14 months later did best it, but I see it only about 8% higher @ 1280x. 
http://www.techpowerup.com/reviews/MSI/GeForce_GT_240/30.html



Borc said:


> 2007? Are you joking? HD3000 was launched in 2011 last year


While it didn't do anything better than the 780G.



Yo_Wattup said:


> Still quite mediocre IMO.


I second that


----------



## Dent1 (Feb 20, 2012)

What confuses me is, why does AMD put the discrete video cards on the mainstream CPUs. Why don't they put GPUs on the enthusiast range too. Like surely they could put a 6850 on a Bulldozer or Phenom II die if they wanted?


----------



## nuno_p (Feb 20, 2012)

Dent1 said:


> What confuses me is, why does AMD put the discrete video cards on the mainstream CPUs. Why don't they put GPUs on the enthusiast range too. Like surely they could put a 6850 on a Bulldozer or Phenom II die if they wanted?



It's not that simple.


----------



## Casecutter (Feb 20, 2012)

Dent1 said:


> What confuses me is, why does AMD put the discrete video cards on the mainstream CPUs. Why don't they put GPUs on the enthusiast range too. Like surely they could put a 6850 on a Bulldozer or Phenom II die if they wanted?




The Trinity A10 lineup will come with the Radeon HD 7660D or something that approaches or betters the current discrete 6570.  Now consider that it will perform similar to a 9800 GTX 512 MB from 2008, which had a TDP of 168 W.  Today they combine the CPU and the GPU and keep it under 100 W.  In 4 years, that's pretty amazing, wouldn't you say?

As to why they don't... it comes down to power and heat. Almost anyone, or the OEMs that build and market the volume of general-use computers, has to do it to a price and within "green" efficiency targets. The cooling would need to be developed, which I would consider a prohibitive cost at this time.  But give it two years and you'll probably be getting 7770 performance from an APU.


----------



## Thefumigator (Feb 20, 2012)

Dent1 said:


> What confuses me is, why does AMD put the discrete video cards on the mainstream CPUs. Why don't they put GPUs on the enthusiast range too. Like surely they could put a 6850 on a Bulldozer or Phenom II die if they wanted?



Because the Llano A8 is a solution for the mainstream, where you count every penny.
You don't count every penny in the enthusiast range, so a more powerful GPU integrated into a Phenom II or Bulldozer won't necessarily mean the enthusiast crowd will buy it: first, because you would be tied to that integrated GPU until you buy a discrete card, and second, no matter how powerful the integrated GPU is, it shares memory, and that makes performance drop; DDR3 is not comparable to GDDR5 in any way.



Yo_Wattup said:


> Still quite mediocre IMO.



Agree.
And I'm not even counting 3D graphics quality, microstuttering, and compatibility.


----------



## repman244 (Feb 20, 2012)

Dent1 said:


> What confuses me is, why does AMD put the discrete video cards on the mainstream CPUs. Why don't they put GPUs on the enthusiast range too. Like surely they could put a 6850 on a Bulldozer or Phenom II die if they wanted?



The TDP would be too high I imagine.


----------



## Halk (Feb 20, 2012)

There's now a significant amount of graphical grunt in these CPUs... does anyone else feel a little touch of regret forking out for a fair portion of silicon that will go unused?


----------



## eidairaman1 (Feb 20, 2012)

Halk said:


> There's now a significant amount of graphical grunt in these CPUs... does anyone else feel a little touch of regret forking out for a fair portion of silicon that will go unused?



Considering gamers/enthusiasts won't use the GPU portion, plus there is no way to link it with CrossFire/SLI, I think Intel should honestly release non-GPU models that don't just have the GPU disabled (no wasted space on the CPU) for a lower price.


----------



## Dent1 (Feb 20, 2012)

Random question: with AMD's Llano APUs, can the integrated GPU CrossFire with a dedicated GPU?


----------



## Zen_ (Feb 20, 2012)

Dent1 said:


> Random question, with AMD's Llano APUs, can the discrete GPU crossfire with a dedicated GPU?



Yes, the graphics in Fusion APUs can be CrossFired with certain discrete cards. I guess it's technically a value-added bonus, but in practice, if the APU graphics are not enough for your needs, a regular CPU + discrete card is a better value. 

Personally, I sold my 2500K + 6850 a while ago since the only games I play now are Day of Defeat: Source and Guild Wars, and replaced them with a cheapo A4-3400, which works just fine. Intel needs to stay competitive with AMD in integrated graphics because there are a lot of people who do want to play games like League of Legends, CoD4, CS:S, and Flash-based games that can be GPU-accelerated more efficiently (think laptop users playing Facebook games). There's no reason for this crowd to have a discrete card. Also, the capabilities of GPGPU have been steadily growing.


----------



## Suhidu (Feb 20, 2012)

Dent1 said:


> Random question, with AMD's Llano APUs, can the discrete GPU crossfire with a dedicated GPU?







Only up to HD 6670 (if A8/A6).
http://www.amd.com/us/products/technologies/dual-graphics/pages/dual-graphics.aspx



Dent1 said:


> What confuses me is, why does AMD put the discrete video cards on the mainstream CPUs. Why don't they put GPUs on the enthusiast range too. Like surely they could put a 6850 on a Bulldozer or Phenom II die if they wanted?


Good points have already been made about why this isn't done, but another thing to consider is that the FM1 socket, platform, boards, etc. were built with display output in mind (whereas AM3+ was built from the old AM3).


----------



## Steevo (Feb 20, 2012)

What is this 720p bullshit?


Again, Intel, I haven't run those resolutions in 10+ years except on laptops. And we aren't comparing a mobile chip, are we? So this wonderful news is just fodder for the toilet paper.


----------



## NC37 (Feb 20, 2012)

lol, people actually thought Intel would get decent graphics for IB. Intel IGPs... the perpetual Bulldozer of graphics.


----------



## EpicShweetness (Feb 20, 2012)

eidairaman1 said:


> Considering Gamers/Enthusiasts wont use the Gpu portion. Plus there is no way to link it with Crossfire/SLi. I think Intel should honestly release non Gpu models that dont have the gpu disabled(no wasted space on cpu) for less price.



Not entirely sure that would be efficient; correct me if I'm wrong, but aren't the IGP portions of Intel chips able to power-gate themselves down to 1 W or less? If so, making separate silicon would be a waste of fabrication.
At any rate, I agree the raw x86 power and cost of Intel's chips negate the desire for a "performance" IGP; that's AMD's job, honestly. Most Intel rigs run discrete graphics, and those that don't are for work/business, so why make the IGP more powerful if all you're gonna do is basic functions?
Hmmm... wonder what the TDP would be on "Ivy" if they just shrunk the HD 3000?


----------



## Borc (Feb 20, 2012)

HTC said:


> Found a review with both an A8-3850 and a GT 240 here




AnandTech used a GT 240 with DDR3-1600 memory; Expreview's had DDR3-1800 memory.


----------



## PopcornMachine (Feb 20, 2012)

My only problem with Sandy Bridge is the IGP that I'm never going to use.

Probably the same for Ivy Bridge, but can't say until I see some benchmarks and know the pricing.

Of course an IGP that could really do something is another matter.  If it's not even a GT240, then saying it's 67% faster than 2500k is just marketing talk targeted at stupid people.

Regardless, don't care how they compare in that regard at all.  This information is next to useless.


----------



## NdMk2o1o (Feb 20, 2012)

Completely Bonkers said:


> How about the Xeon dual socket version of this chip. If it could combine graphics performance, now that would be decent enough for most people, and every reason for everyone to buy a workstation board and for Intel to sell twice as many CPU chips



Sure, let's all buy a dual-socket board for $500 and 2 Xeon processors for $600 just to get gaming performance similar to a mid-range GPU that costs $50. You should work for Intel...


----------



## Wile E (Feb 20, 2012)

Meh. Again, not very useful to me.


----------



## LAN_deRf_HA (Feb 21, 2012)

How much is the power consumption of the HD 4000? If it's like 10 W compared to the GT 240's 50 W, it might be a bit more impressive.


----------



## Halk (Feb 21, 2012)

Perhaps they'll introduce a smaller subset with busted graphics modules? In the same way AMD has tri-core and dual-core chips which are cut down because they didn't validate, Intel could sell on the units that didn't validate on graphics.


----------



## badtaylorx (Feb 21, 2012)

Given the 3DMark P score and the GPU P score, can we not extrapolate the CPU score???
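For what it's worth, 3DMark Vantage's overall score is reportedly a weighted harmonic mean of the GPU and CPU sub-scores, so the CPU score can be backed out if both other numbers are known. A minimal sketch, assuming the commonly cited 0.75/0.25 weighting, which may not match Futuremark's exact published formula:

```python
# Assumed 3DMark Vantage weights (GPU-heavy); treat these as an assumption.
W_GPU, W_CPU = 0.75, 0.25

def overall(gpu: float, cpu: float) -> float:
    """Overall score as a weighted harmonic mean of the two sub-scores."""
    return 1.0 / (W_GPU / gpu + W_CPU / cpu)

def cpu_from_overall(p: float, gpu: float) -> float:
    """Solve 1/P = W_GPU/GPU + W_CPU/CPU for the CPU score."""
    return W_CPU / (1.0 / p - W_GPU / gpu)

# Round-trip check with made-up scores
p = overall(4000.0, 12000.0)
print(round(cpu_from_overall(p, 4000.0)))  # 12000
```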


----------



## xenocide (Feb 21, 2012)

Not a bad improvement over the last generation.  Obviously a discrete solution will always be preferable, but I was expecting minimal gains going from HD3000->HD4000.


----------



## qubit (Feb 21, 2012)

So the graphics are way better but still weak? That figures.

However, the super high resolution capability I reported on here should be interesting.


----------



## mastrdrver (Feb 21, 2012)

Has anyone played those games on a 2500K and can confirm that, at those settings, it is not a roller coaster of frame rates from high to low, as recent Intel iGPUs are known for?

Also, how well do these in-game time demos/benchmarks compare to real gameplay? I know they usually don't.


----------



## xenocide (Feb 21, 2012)

mastrdrver said:


> Also how well do these in game time demo/benchmarks compare to real game play? I know usually they don't.



Most of the time they are just on-rails rendered scenarios.  Half-Life 2 features one, and they are generally pretty accurate, if a little less stressful than real-time gameplay.


----------



## Nihilus (Feb 21, 2012)

*Great Improvement!*

This is pretty awesome for a CPU with this much power.  Yeah, yeah, I'm sure the Llanos will beat it, but I guess they wanted to leave room for the GPU guys.  Compare the processing power of this chip to the Llano's and these will smoke 'em.


----------



## naoan (Feb 21, 2012)

I see no reason to dislike this progress.


----------



## johnspack (Feb 21, 2012)

Nice, should make for really efficient HTPCs!  Can't wait for Ivy Bridge-E...


----------



## Xiphos (Feb 21, 2012)

Why are people dissing better graphics performance? Last time I checked, better is good.

What I really want to know is how much faster the Quick Sync feature on the Core i5-3570K is than on the i5-2500K.

Yeah, the IGP may not be of any use to the majority of gamers, but for gamers who also record and render videos, the IGP along with Quick Sync is very useful.

If Quick Sync on the Core i5-3570K is also 67% faster than on the Core i5-2500K, I think I might just have to upgrade.


----------



## eidairaman1 (Feb 21, 2012)

Xiphos said:


> Why are people dissing on better graphics performance? last time I checked better is good.
> 
> what I really want to know is how much faster is the quick sync feature on the Core i5-3570K than the i5 2500k?
> 
> ...



Dude, you can't run the IGP and a separate GPU at the same time on that setup; either you game on a discrete GPU or you game on the IGP. You can't use both at the same time.


----------



## ensabrenoir (Feb 21, 2012)

Xiphos said:


> Why are people dissing on better graphics performance? last time I checked better is good.
> 
> what I really want to know is how much faster is the quick sync feature on the Core i5-3570K than the i5 2500k?
> 
> ...



Yes, better is good... but they've still a ways to go. This has forever been Intel's arrow in the knee. I foresee Intel conquering this soon though... with M O N E Y. They have no choice really... they must venture where AMD/ATI rules. And with AMD's current rate of bloodshed... there is opportunity.


----------



## 1c3d0g (Feb 21, 2012)

Awesome! Intel's Sandy Bridge IGP is great for HD (Blu-ray) video. I encountered some stuttering even with a GeForce 520 when I played The Dark Knight in its full glory, but none with the HD 3000. This was with Splash Pro EX, which offers lots of post-processing options (thus extremely GPU-intensive) for video. Hopefully Ivy Bridge will continue this great trend.


----------



## hellrazor (Feb 21, 2012)

Yay my video card is compared to something!


----------



## Crazykenny (Feb 21, 2012)

This is an Intel chip generation I am gonna skip in its entirety. There's no use upgrading to it from a 2600K, and I dare say it's kind of a waste upgrading to it from the 1366 platform. Intel cut its own fingers by making such a solid, long-lasting chip.


----------



## faramir (Feb 21, 2012)

Dent1 said:


> What confuses me is, why does AMD put the discrete video cards on the mainstream CPUs. Why don't they put GPUs on the enthusiast range too. Like surely they could put a 6850 on a Bulldozer or Phenom II die if they wanted?



Serious TDP constraints. You cannot just take a 125 W CPU and a 200+ W GPU, glue them together, and expect the combination to run in home PCs. In my opinion, 100 W is the maximum reasonable TDP for a CPU/APU (this leaves enough room for overclockers), and 65 W or thereabouts should be the target for the non-enthusiast market (like my existing Core 2 Duo, which when undervolted runs under 50 degrees C under load, hence no annoying jet-engine noises from the heatsink fan).


----------



## hardcore_gamer (Feb 21, 2012)

Most people who buy these CPUs won't use the integrated graphics. What a waste of die area.


----------



## laszlo (Feb 21, 2012)

hardcore_gamer said:


> Most people who buy these CPUs wont use the integrated graphics. What a waste of die area.



No waste of die as I see it; you forget the OEMs who build office PCs, and for that use it's surely more than enough.

Also don't forget that many home users don't play games; for surfing and Face-shot-off-Book you don't need a discrete graphics card.


----------



## Casecutter (Feb 21, 2012)

laszlo said:


> No waste of die as i saw;you forget OEM who build office pc's and sure for that use is more than enough
> 
> also don't forget that many home user don't play games;for surfing and Face-shot-off-Book u don't need discrete graphic cars


Not dissing what you said, and while the improvement is welcome, most corporate-office or home-use folk shouldn't be compelled to step up to the i5 level just to receive what's still barely acceptable graphics performance (this isn't about gaming), but the ability to truly multi-task, which APUs prove can be done. People shouldn't have to cease background tasks just to watch the video from an e-mail.  

The bigger question is what Intel is going to saddle the entry i3 with... still the HD 2000?  That's what this write-up couldn't express, and why is that?  Nice that the i5 buyer gets a bone, but let's face it, that person will be looking for an upgrade almost from the day they take it home.  At least if saddled with a decent OEM 350 W PSU, AMD will get a 7750 sale.  If Intel can't give this to the i3 buyer, it's truly worthless news!


----------



## Halk (Feb 21, 2012)

laszlo said:


> No waste of die as i saw;you forget OEM who build office pc's and sure for that use is more than enough
> 
> also don't forget that many home user don't play games;for surfing and Face-shot-off-Book u don't need discrete graphic cars



That's the thing... it's more than enough for office use, and not enough for gaming use.

Why bother with the upgrade if it's not going to be of any benefit for either? 

As far as bang-for-buck, cheap-as-you-can-get gaming goes... the CPU is probably too much for the graphics on it.

All I can think is that Intel has cut its integrated graphics back to the point where there are no reasonable gains left, so any less grunt in the graphics department wouldn't reduce costs or thermals.


----------



## mastrdrver (Feb 21, 2012)

Xiphos said:


> Why are people dissing on better graphics performance? last time I checked better is good.



Because this is Intel, and drivers; need I say more?

Besides that, these are not official, and very little is known about the drivers used, actual gameplay, and the settings used for the benchmark. This is like cheering for new NVIDIA stuff when all that comes out is a leak. Who knows how good or bad it really is when it comes to final silicon.


----------



## Xiphos (Feb 22, 2012)

eidairaman1 said:


> Dude You cant run the IGP and a Separate GPU at same time on that setup, either u game on a discreet GPU or you Game on the IGP cant use both at same time



Bro, you've heard of Lucid's Virtu? It lets you switch between the IGP and a dedicated card on the fly; no need to reboot. 

OK, now we know you can switch easily. Here is what you do:
you game and record with your dedicated card, and then use the IGP while rendering your 360-degree triple-kill clip in After Effects or Sony Vegas. 

Revolutionary idea, huh? I know.


----------



## ViperXTR (Feb 22, 2012)

Yes, Lucid Virtu; I use the HD 2000 of my i3 to do some Quick Sync from time to time, and I've also tried using it in Media Player Classic as the default DXVA device (through a Virtu profile).


----------



## 1c3d0g (Feb 22, 2012)

Casecutter said:


> ...
> The bigger question is what Intel is going to straddle entry i3 with… HD2000 still? ...



The HD 2500, which is a step up from the HD 2000 (8 EUs vs. 6 EUs, although they're not directly comparable either, due to significant architectural differences between Ivy and Sandy).


----------



## Casecutter (Feb 23, 2012)

1c3d0g said:


> The HD2500. Which is a step up from HD2000 (8 EU's v.s. 6 EU's, although they're not directly comparable either, due to significant architectural differences between Ivy > Sandy).


Be still my gentle heart... and how would CPU architectural improvements really improve the graphics component? 

Business machines are fine with the HD 2500, but home/personal users today should not be left with Intel's spare change. Basically, this means Intel's aim is to have home-computer buyers looking to the higher-cost i5, lulled with the idea of improved graphics power, just to find they were sold functionality more typical of 2008: more CPU power than most general home users could ever fancy, with below-middling graphics ability. AMD should have no problem marketing the APU's balanced approach and multi-tasking abilities this next round.


----------

