
Intel Core i5-3570K vs. i7-3770K Ivy Bridge

Are we testing CPUs or GPUs? If you test at 1440p, the gaming section only needs one sentence: all games are GPU limited, so it doesn't matter which CPU you have.

Oh really? So you would pair an AMD Sempron 145 single-core processor with GTX 680s in SLI to game at 1440p, since "all games are limited by the GPU, it doesn't matter which CPU you have"?

It doesn't work that way. If anything, gaming at 1440p with super high-end GPUs needs a fast processor. Even the guys at PC Per found that with GTX Titans in SLI, there's a CPU bottleneck when an i7-3770K tries to drive those two massive cards.

People seem to think "low-resolution testing shows CPU performance," but in reality it's both extreme ends that test a CPU. The middle range, i.e. 1080p with a GTX 660 Ti/670 or Radeon 7870/7950 and an i5/i7/FX-8350, is where you can't tell a difference.
 
Are we testing CPUs or GPUs? If you test at 1440p, the gaming section only needs one sentence: all games are GPU limited, so it doesn't matter which CPU you have.

If you still want to test at 1920x1080, you would need to test with dual high-end graphics cards (CrossFire/SLI).

Tell that to Crysis 3, where increasing core speed is the best way to increase your GPU usage.

You're arguing that we should test for the results we expect to get. That doesn't tell you anything. If you test a dozen games at 1440p and only one of them scales well with certain CPUs, that's still worth the effort, because at least you got one useful result. Low-resolution testing gives you no useful results.
 
Any other games you can recommend for CPU testing?

Skyrim, Battlefield 3, Dragon Age 2, and some of the games using the latest Unreal Engine. :o
 
Dang it! I should've just paid that $30 for the 3770k.
 
I guess it all depends on the particular CPU you get and what your main purpose is. I primarily do memory overclocking/benching, and the 3570K I have trounced the 3770K I bought and subsequently got rid of. The 3570K can do at least DDR3-2800, while the 3770K was limited to DDR3-2666; it's luck of the draw as to how good the IMC is. Also, my 3570K can do 4.5 GHz with only 1.20 V.
 
You say this... but those extra threads/cores give you a HUGE advantage in many of the games we play each day. Benchmarks alone test a very specific environment that isn't very CPU dependent. Games like MMOs, online FPSes, etc. all require significant CPU grunt, and differences in threads, clocks, cache, etc. make significant gains, ESPECIALLY in minimum FPS. The old "put a rig together and run a benchmark of a single-player game" approach only tests GPU horsepower; it does almost nothing to test real-world CPU stresses.

I understand all this; however, a strong, well-done performance test will show those varying degrees of performance. Say you play GameTitleA and your minimum FPS is 40 with your i7, but on the same system with an i3 it's 38: now you have a better understanding of exactly what you are spending money on.

That being said, I have an overclocked 3770K (you can see my system specs), and I get nearly no difference in min/avg/max FPS in D3, SC2, BF3, or Bad Company 2 with HT on vs. off; nothing worth noting, and no general system performance increase that would make me spend the extra money on the i7.

However, I have my i7 because at any given time I run 1-4 VMs and often have audio/video work going on in the background while I'm playing a game, and in those cases leaving HT on can make a noticeable difference.

Most people are surprised not by the fact that an i7 is faster than an i5, but by how little the difference is in reality for their situation.

IME most people are very pleased with the gaming performance of their i3 and GPU, no matter the title.

(Granted most people are not rollin GTX 680s either...) :toast:
 
Hexus PiFast is a single-threaded Pi calculation program, which makes Hyper-Threading useless while running it. The results confirm that, since the i5-3570K beat the i7-3770K at 4.5 GHz.
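For anyone wondering why a single-threaded benchmark can't gain anything from HT, here's a toy sketch (purely illustrative, nothing to do with the actual PiFast code): the whole calculation is one serial loop, so only one logical core is ever busy and clock speed/IPC decide the result.

```python
# pi_single_thread.py - toy single-threaded Pi benchmark (illustration only,
# not PiFast). Hyper-Threading cannot help here: the work is one serial loop,
# so only a single logical core ever does anything.
import time

def leibniz_pi(terms: int) -> float:
    """Approximate Pi with the Leibniz series: 4 * (1 - 1/3 + 1/5 - ...)."""
    total = 0.0
    sign = 1.0
    for k in range(terms):
        total += sign / (2 * k + 1)
        sign = -sign
    return 4.0 * total

if __name__ == "__main__":
    start = time.perf_counter()
    pi = leibniz_pi(50_000_000)  # tune the term count to your CPU
    elapsed = time.perf_counter() - start
    print(f"pi ~= {pi:.10f}  in {elapsed:.2f} s on one thread")
```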

I would REALLY like to see how the 3770K performs with hyper-threading turned off in a lot of these tests. Just to see how much the extra cache helps and how much hyper-threading decreases single-threaded performance.
 
I would REALLY like to see how the 3770K performs with hyper-threading turned off in a lot of these tests. Just to see how much the extra cache helps and how much hyper-threading decreases single-threaded performance.

HT doesn't hurt much; you could probably expect it to come out nearly identical to the 3570K. The extra cache will help a bit in general responsiveness, but I doubt much more.

I can probably get my hands on a 3570K to do some of this testing on otherwise identical hardware, because with numbers this close that's going to matter.
 
Regarding gaming performance.

These CPUs are GPU limited. Use a 2- or 3-way SLI setup to check the difference (if there is any) between an i5 and an i7.
 
I would REALLY like to see how the 3770K performs with hyper-threading turned off in a lot of these tests. Just to see how much the extra cache helps and how much hyper-threading decreases single-threaded performance.

HT helps in some games and hurts in others, but we are talking about a 5% difference at most. No, the reason hard-core overclockers turn it off is to gain extra stability and higher clocks on the CPU.
 
Krneki, please do not double or triple post. Use the multiquote button.
 
HT helps in some games and hurts in others, but we are talking about a 5% difference at most. No, the reason hard-core overclockers turn it off is to gain extra stability and higher clocks on the CPU.

I am confused as to where you get this information.

There are lots of games that benefit heavily from HT. 5% is an arbitrary number.
 
I would like to know if there is any difference in gameplay smoothness between them. Frame time in MSI Afterburner can be a good criterion.
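If someone does log frame times, a rough sketch like this could turn the log into average FPS plus 1% and 0.1% lows. It assumes a plain text file with one frame time in milliseconds per line; a real MSI Afterburner/RTSS export will likely need the parsing adjusted.

```python
# frametime_stats.py - summarize a frame-time log as average FPS plus 1% and
# 0.1% lows. Assumes one frame time in milliseconds per line; adjust parsing
# for whatever format your logging tool actually exports.
import sys
import statistics

def summarize(frametimes_ms):
    frametimes_ms = sorted(frametimes_ms)
    n = len(frametimes_ms)
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    # "1% low" here = FPS of the frame time at the 99th percentile
    p99 = frametimes_ms[min(n - 1, int(n * 0.99))]
    p999 = frametimes_ms[min(n - 1, int(n * 0.999))]
    return avg_fps, 1000.0 / p99, 1000.0 / p999

if __name__ == "__main__":
    with open(sys.argv[1]) as f:
        times = [float(line) for line in f if line.strip()]
    avg, low1, low01 = summarize(times)
    print(f"avg {avg:.1f} fps | 1% low {low1:.1f} fps | 0.1% low {low01:.1f} fps")
```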
 
I am confused as to where you get this information.

There are lots of games that benefit heavily from HT. 5% is an arbitrary number.

I have to agree with Krneki on this one. While there are some games where you can find more significant gains, the average is indeed around 5% or less.
The reason is that three cores are usually enough to serve most current game engines, which is why HT gives much more significant performance gains on dual-core CPUs compared to quads.
On the other hand, HT optimization has also advanced in the last few years, so engines tend to get more out of HT than they used to.

I spent a huge amount of time running various benches on my L4D2 server, which has a dual-core HT-enabled CPU. While the Source client is heavily multi-threaded, the dedicated server binaries are not; they are "only" optimized for HT (well, the network load can be assigned to a separate core if available, but it doesn't make much difference in the bench results). I spawned different numbers of zombies on the server in the same scenarios (from a few to thousands!), and I can tell you that HT helped a lot. Source is one of the most optimized engines currently out there, and it proves that proper optimization for HT indeed provides better performance. I also found similar results while testing performance in BF:BC2, which is apparently also well optimized for HT.

But still, in this case (i5 vs. i7), "only" 5% is probably the right number, or perhaps even too high.
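For anyone who wants to approximate "HT off" on a dedicated server box without a BIOS round trip, a sketch like this could pin the process to one logical CPU per physical core. It assumes a Linux host (the sibling layout is read from sysfs) and the third-party psutil package; on Windows you would set affinity from Task Manager or supply the sibling list by hand.

```python
# ht_pin.py - approximate "HT off" for a single process by pinning it to one
# logical CPU per physical core (Linux sketch; requires `pip install psutil`).
# The sibling layout is read from sysfs rather than guessed.
import glob
import sys
import psutil

def one_logical_per_core():
    """Pick the first logical CPU listed for each physical core."""
    chosen = set()
    seen_cores = set()
    for path in sorted(glob.glob(
            "/sys/devices/system/cpu/cpu[0-9]*/topology/thread_siblings_list")):
        with open(path) as f:
            siblings = f.read().strip()
        if siblings not in seen_cores:
            seen_cores.add(siblings)
            # siblings looks like "0,4" or "0-1"; take the first CPU id
            first = siblings.replace("-", ",").split(",")[0]
            chosen.add(int(first))
    return sorted(chosen)

if __name__ == "__main__":
    pid = int(sys.argv[1])                  # PID of the game/server process
    cpus = one_logical_per_core()
    psutil.Process(pid).cpu_affinity(cpus)  # e.g. [0, 1] on a 2C/4T CPU
    print(f"pinned PID {pid} to logical CPUs {cpus}")
```

This only restricts one process rather than disabling HT system-wide, so it is a rough stand-in for the BIOS toggle, not an exact equivalent.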
 
The games were GPU limited. You need to lower the resolution a little or use an SLI or 3-way SLI setup for this... ;)
 
The games were GPU limited. You need to lower the resolution a little and also use an SLI or 3-way SLI setup for this... ;)

Do you play your games at a lower resolution with a triple-SLI setup?
 
Do you play your games at a lower resolution with a triple-SLI setup?

This. They test this way to show us, the consumers, what real-world differences we are going to see at the resolutions we actually play at. Yes, we all know lowering the resolution will exaggerate the differences in CPU power and clearly outline which CPU has the most raw grunt, but what good is that if at 1080p the difference between CPU A and CPU B is 1 FPS? This is what you need to understand.

If you play at 800x600, then sure, that would be more valid.
 
I would REALLY like to see how the 3770K performs with hyper-threading turned off in a lot of these tests. Just to see how much the extra cache helps and how much hyper-threading decreases single-threaded performance.

Good idea!
 
Good idea!

This is highly application dependent. HT can decrease performance if the two virtual cores have to stall because of each other, and sometimes even the lack of HT optimization is enough, but in the case of Ivy Bridge single-threaded code does not suffer as much as it would on Sandy Bridge (or earlier). Ivy Bridge can dynamically allocate resources when only one thread is active in a core, as opposed to the solution in SB (and earlier) where some resources were left unused due to static allocation. I don't remember the exact figures, but in the few worst cases things were only 1-2% slower than with HT disabled, while leaving HT enabled tended to increase performance in most multithreaded situations.
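If you want to put a rough number on HT scaling for your own chip, you could compare throughput with one worker per physical core versus one per logical core. A toy sketch (Python with multiprocessing to sidestep the GIL, plus psutil for the core counts); the busy-loop workload is a stand-in, not representative of any particular game engine:

```python
# ht_scaling.py - compare throughput with one worker per physical core vs one
# per logical core. The workload is a toy busy loop, so the result is only a
# rough indication of how much Hyper-Threading adds on your CPU.
import time
from multiprocessing import Pool

import psutil  # pip install psutil (used only for the core counts)

def burn(n: int) -> int:
    """Arbitrary integer-heavy busy loop."""
    acc = 0
    for i in range(n):
        acc = (acc * 1103515245 + i) % (2**31)
    return acc

def run(workers: int, jobs: int = 16, n: int = 3_000_000) -> float:
    start = time.perf_counter()
    with Pool(workers) as pool:
        pool.map(burn, [n] * jobs)
    return time.perf_counter() - start

if __name__ == "__main__":
    physical = psutil.cpu_count(logical=False)
    logical = psutil.cpu_count(logical=True)
    t_phys = run(physical)
    t_logi = run(logical)
    print(f"{physical} workers: {t_phys:.2f} s | {logical} workers: {t_logi:.2f} s")
    print(f"HT speed-up: {t_phys / t_logi:.2f}x")
```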
 
http://forums.anandtech.com/showthread.php?t=2249262

This was one of the forum posts that discusses it; great data provided there, from someone who went from an i5-3570K to an i7-3770K to an i7-3930K.

I searched the topic for 5 minutes and couldn't find any data supporting your claim about HT. What's more, I found this reply.

Yes, I did the same testing with HT off, and there was just no measurable difference.

P.S.: Don't take this as a personal attack; I only want to know how much HT will improve my FPS in games so I can better decide what my next CPU will be when I need to upgrade my i5-750 @ 4 GHz. As of now, if you overclock your CPU you are better off disabling HT in order to gain a higher CPU clock.
 