Monday, February 20th 2012
Core i5-3570K Graphics 67% Faster Than Core i5-2500K, 36% Slower Than GeForce GT 240
An Expreview community member ran benchmarks comparing the Intel HD 4000 graphics embedded in the upcoming 22 nm "Ivy Bridge" Core i5-3570K against the integrated graphics of the Core i5-2500K and a discrete NVIDIA GeForce GT 240. These tests are endorsed by the site. The benchmark suite consisted of games that aren't known to be particularly taxing on graphics hardware by today's standards, yet remain extremely popular: StarCraft II, Left 4 Dead 2, DiRT 3, and Street Fighter IV. Slightly more graphics-intensive tests included Far Cry 2 and 3DMark Vantage. All benchmarks were run at 1280 x 720 resolution.
The Intel HD 4000 graphics core beats the HD 3000 hands down, with performance leads as high as 122% in one test. The chip produces more than playable frame rates in Left 4 Dead 2 and Street Fighter IV, both well above 50 FPS, and even DiRT 3 and Far Cry 2 run acceptably at over 30 FPS. StarCraft II is where it dips below 30 FPS, so the chip might get bogged down in intense battles; for that game, a mainstream discrete GeForce or Radeon is a must. On average, the graphics core embedded in the Core i5-3570K was found to be 67.25% faster than the one on the Core i5-2500K.

When pitted against the 2+ year old GeForce GT 240, the Core i5-3570K struggles. In StarCraft II, it's 53.64% slower, and on average the GT 240 emerged 56.25% faster. Still, this is a decent effort by Intel to cash in on entry-level graphics, and we are hearing good things about the HD video playback and GPU acceleration capabilities of Intel's HD 4000 core, so there's still something to look out for. Admittedly, comparing the i5-3570K to the i5-2500K isn't a 100% scientific comparison, since CPU performance also factors in, but it was done purely to assess how far Intel has come with its graphics.
Source:
Expreview
62 Comments on Core i5-3570K Graphics 67% Faster Than Core i5-2500K, 36% Slower Than GeForce GT 240
Also, don't forget that many home users don't play games; for surfing and Face-shot-off-Book you don't need a discrete graphics card.
The bigger question is what Intel is going to saddle the entry i3 with… HD 2000 still? That's what this write-up couldn't express, and why is that? Nice that the i5 buyer gets a bone, but let's face it, that person will be looking for an upgrade almost from the day they take it home. At least if saddled with a decent OEM 350W, AMD will get a 7750 sale. If Intel can't give this to the i3 buyer, it's truly worthless news!
Why bother with the upgrade if it's not going to be of any benefit either way?
As far as bang for buck goes, for the cheapest gaming experience you can get, the CPU is probably too much for the graphics on it.
All I can think of is that Intel has cut back its integrated graphics to the point where there are no reasonable gains left: any less grunt in the graphics department wouldn't reduce costs or thermals.
Besides that, these results are not official, and very little is known about the drivers, actual gameplay, and settings used for the benchmarks. This is like cheering for new NVIDIA stuff when all that's out is a leak. Who knows how good or bad it will really be when it comes to final silicon.
OK, now we know you can switch easily. Here is what you do:
You game and record with your dedicated card, and then use the IGP while rendering your 360-degree triple-kill clip in After Effects or Sony Vegas.
Revolutionary idea, huh? I know.
HD 2500 is fine for business machines, but home/personal users today should not be left with Intel's spare change. Basically, this means Intel's aim is to have home computer buyers looking to the higher-cost i5, lulled by the idea of improved graphics power, only to find they were sold functionality that was popular in 2008: more CPU power than most general home users could ever want, with below-middling graphics ability. AMD should have no problem marketing the APU's balanced approach and multi-tasking abilities this next round.