Monday, February 20th 2012

Core i5-3570K Graphics 67% Faster Than Core i5-2500K, 36% Slower Than GeForce GT 240

An Expreview community member ran benchmarks comparing the performance of the Intel HD 4000 graphics embedded into the upcoming 22 nm "Ivy Bridge" Core i5-3570K against the integrated graphics of the Core i5-2500K and a discrete NVIDIA GeForce GT 240. These tests are endorsed by the site. The suite of benchmarks included games that aren't known to be very taxing on graphics hardware by today's standards, yet are extremely popular, such as StarCraft II, Left 4 Dead 2, DiRT 3, and Street Fighter IV. Some of the slightly more graphics-intensive benchmarks included Far Cry 2 and 3DMark Vantage. All benchmarks were run at 1280 x 720 resolution.

The Intel HD 4000 graphics core beats the HD 3000 hands down, with performance leads as high as 122% in one test. The chip produces more than playable frame rates in Left 4 Dead 2 and Street Fighter IV, both well above 50 FPS, and even DiRT 3 and Far Cry 2 run acceptably at over 30 FPS. StarCraft II is where it dropped under 30 FPS, so the chip might get bogged down in intense battles; there, a mainstream discrete GeForce or Radeon is a must. On average, the graphics core embedded into the Core i5-3570K was found to be 67.25% faster than the one on the Core i5-2500K.
When pitted against the 2+ year old GeForce GT 240, the Core i5-3570K struggles. In StarCraft II it's 53.64% slower, and on average the GT 240 emerged 56.25% faster. Still, it's a decent effort by Intel to cash in on entry-level graphics. We are hearing nice things about the HD video playback and GPU-acceleration capabilities of Intel's HD 4000 core, so there's still something to look out for. Granted, comparing the i5-3570K to the i5-2500K isn't a 100% scientific comparison, since CPU performance also factors in, but it was done purely to assess how far along Intel has come with its graphics.
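As a side note on the arithmetic: "56.25% faster" and "36% slower" describe the same gap from opposite sides, since 1 − 1/1.5625 = 0.36. Below is a minimal Python sketch of that conversion; it uses only the percentages quoted above, and the helper functions are illustrative rather than part of Expreview's methodology.

```python
# Minimal sketch of how the headline's "X% faster" and "Y% slower" figures
# relate. Only the percentages quoted in the article are used here; the
# helpers take generic frame-rate (or score) values.

def percent_faster(a, b):
    """How much faster A is than B, in percent."""
    return (a / b - 1.0) * 100.0

def percent_slower(a, b):
    """How much slower A is than B, in percent."""
    return (1.0 - a / b) * 100.0

# If the GT 240 averages 56.25% faster than the HD 4000, the HD 4000 is
# 36% slower than the GT 240 -- the two headline figures are the same
# relationship viewed from opposite sides.
gt240 = 156.25   # GT 240 normalised to an HD 4000 score of 100
hd4000 = 100.0
print(percent_faster(gt240, hd4000))   # -> 56.25
print(percent_slower(hd4000, gt240))   # -> 36.0

# The 67.25% figure against the i5-2500K is likewise a simple arithmetic
# mean of per-game percentage leads, computed the same way per title.
```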
Source: Expreview

62 Comments on Core i5-3570K Graphics 67% Faster Than Core i5-2500K, 36% Slower Than GeForce GT 240

#51
hellrazor
Yay my video card is compared to something!
#52
Crazykenny
This is an Intel chip generation I'm gonna skip in its entirety. There's no use upgrading to it from a 2600K, and I dare say it's kind of a waste upgrading to it from a 1366 platform. Intel cut its own fingers by making such a solid, long-lasting chip :laugh:
#53
faramir
Dent1: What confuses me is, why does AMD put the discrete video cards on the mainstream CPUs? Why don't they put GPUs on the enthusiast range too? Surely they could put a 6850 on a Bulldozer or Phenom II die if they wanted?
Serious TDP constraints. You cannot just take a 125 W CPU and a 200+ W GPU, glue them together and expect the combination to run in home PCs. In my opinion, 100 W is the maximum reasonable TDP for a CPU/APU (this leaves enough room for overclockers), and 65 W or thereabouts should be the target for the non-enthusiast market (like my existing Core 2 Duo, which when undervolted stays under 50 °C under load, hence no annoying jet-engine noises from the heatsink fan).
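A rough back-of-the-envelope sketch of that power-budget argument, using only the nominal TDP figures mentioned in this comment (not measured values):

```python
# Back-of-the-envelope power-budget check using the TDP figures from the
# comment above. All values are nominal TDPs, purely illustrative.

CPU_TDP_W = 125        # high-end desktop CPU
GPU_TDP_W = 200        # mid/high-end discrete GPU ("200+ W")
APU_BUDGET_W = 100     # suggested maximum reasonable CPU/APU TDP
QUIET_TARGET_W = 65    # suggested non-enthusiast target

combined = CPU_TDP_W + GPU_TDP_W
print(f"Naively glued together: {combined} W in one socket")
print(f"Over the {APU_BUDGET_W} W APU budget by {combined - APU_BUDGET_W} W")
print(f"Over the {QUIET_TARGET_W} W quiet-PC target by {combined - QUIET_TARGET_W} W")
```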
#54
hardcore_gamer
Most people who buy these CPUs won't use the integrated graphics. What a waste of die area.
#55
laszlo
hardcore_gamer: Most people who buy these CPUs won't use the integrated graphics. What a waste of die area.
No waste of die area as I see it; you forget the OEMs who build office PCs, and for that use it's more than enough.

Also don't forget that many home users don't play games; for surfing and Face-shot-off-Book you don't need a discrete graphics card.
#56
Casecutter
laszlo: No waste of die area as I see it; you forget the OEMs who build office PCs, and for that use it's more than enough.

Also don't forget that many home users don't play games; for surfing and Face-shot-off-Book you don't need a discrete graphics card.
Not dissing what you said, and while the improvement is welcome, most corporate office or home users shouldn't be compelled to step up to the i5 level just to receive what's still barely acceptable graphics performance (this isn't about gaming), but rather the ability to truly multi-task, which APUs prove can be done. People shouldn't have to stop background tasks just to watch the video from an email.

The bigger question is what Intel is going to saddle the entry-level i3 with… HD 2000 still? That's what this write-up couldn't express, and why is that? Nice that the i5 buyer gets a bone, but let's face it, that person will be looking for an upgrade almost from the day they take it home. At least if the machine comes with a decent OEM 350 W power supply, AMD will get a 7750 sale. If Intel can't give this to the i3 buyer, it's truly worthless news!
#57
Halk
laszlo: No waste of die area as I see it; you forget the OEMs who build office PCs, and for that use it's more than enough.

Also don't forget that many home users don't play games; for surfing and Face-shot-off-Book you don't need a discrete graphics card.
That's the thing... it's more than enough for office use, and not enough for gaming use.

Why bother with the upgrade if it's not going to be of any benefit for either?

As far as a bang-for-buck, as-cheap-as-you-can-get gaming experience goes, the CPU is probably too much for the graphics on it.

All I can think is that Intel has cut its integrated graphics back to the point where there are no reasonable gains left to be had; any less grunt in the graphics department wouldn't reduce costs or thermals.
#58
mastrdrver
Xiphos: Why are people dissing better graphics performance? Last time I checked, better is good.
Because this is Intel and drivers, need I say more?

Besides that, these results are not official, and very little is known about the drivers used, actual gameplay, and the settings used for the benchmarks. This is like cheering for new NVIDIA stuff when all that's out is a leak. Who knows how good or bad it really is when it comes to final silicon.
#59
Xiphos
eidairaman1: Dude, you can't run the IGP and a separate GPU at the same time on that setup; either you game on a discrete GPU or you game on the IGP, you can't use both at the same time.
Bro, you heard of Lucid's Virtu? It lets you switch between the IGP and a dedicated card on the fly. No need to reboot.

OK, now we know you can switch easily. Here's what you do:
you game and record with your dedicated card, and then use the IGP while rendering your 360-degree triple-kill clip in After Effects or Sony Vegas.

Revolutionary idea, huh? I know.
#60
ViperXTR
Yes, Lucid Virtu. I use the HD 2000 on my i3 to do some Quick Sync from time to time; I also tried using it in Media Player Classic as the default DXVA device (through a Virtu profile).
#61
1c3d0g
Casecutter: ...
The bigger question is what Intel is going to saddle the entry-level i3 with… HD 2000 still? ...
The HD 2500, which is a step up from the HD 2000 (8 EUs vs. 6 EUs, although they're not directly comparable either, due to significant architectural differences between Ivy Bridge and Sandy Bridge).
#62
Casecutter
1c3d0g: The HD 2500, which is a step up from the HD 2000 (8 EUs vs. 6 EUs, although they're not directly comparable either, due to significant architectural differences between Ivy Bridge and Sandy Bridge).
Be still my gentle heart... and how would CPU architectural improvements really improve the graphics component? :cool:

Business machines are fine with the HD 2500, but home/personal users today shouldn't be left with Intel's spare change. Basically, Intel's aim is to have home computer buyers looking to the higher-cost i5, lulled by the idea of improved graphics power, only to find they were sold functionality that was more at home in 2008: more CPU power than most general home users could ever fancy, paired with below-middling graphics ability. AMD should have no problem marketing the APU's balanced approach and multi-tasking abilities this next round.