# Gaming benchmarks: Core i7 3770 hyperthreading test (20 games tested)



## Artas1984 (Oct 1, 2015)

Hyper-threading as we know it today (never mind the Pentium 4 era) has been tested in games since the day the Nehalem/Lynnfield Core i5 arrived. Back then we saw direct comparisons between equally clocked i5 750 and i7 920 processors.

TechPowerUp has actually done similar reviews in the past, like this

Those reviews, however, contain too few games, so I am bringing up this Core i7 vs. Core i5 debate once again.

I own i7 3770 and i5 3570K processors, but in this BENCHMARK (not a CPU review, mind you) I will simply use the i7 3770 with HT disabled as my stand-in Core i5. This is because the i5 3570K runs all four cores at 3.8 GHz in turbo mode, while the i7 3770 boosts only one core to 3.9 GHz, with the other cores clocking lower. Also, swapping processors takes time, and I have a very nasty cooler that is very difficult to dismount...

Twenty games have been tested for this benchmark. I am using 1920x1080, since it is the most popular mainstream resolution for mid-range cards like the GTX 760/GTX 960 and R9 270X/R9 370X, and gamers do not play at 1024x768 anyway, despite games being more GPU-bound at 1920x1080. Game settings are set to the highest playable: if a game does not hold at least 40 FPS at maximum settings, I lower them. No AA is used in any game except Alan Wake, where AA cannot be disabled.


Test setup
Intel Core I7 3770
2X4 GB DDR3 1600 MHz C9
GeForce GTX760 OC 2 GB
Windows 7 Pro 64 bit
Forceware 355.81

For those who prefer video presentation

Let's begin.

Alan Wake American Nightmare (highest settings)





Batman Arkham Origins (highest settings)





Battlefield 4 (highest settings)





Bioshock Infinite Burial at Sea (highest settings)





Company of Heroes 2 (highest settings)





Crysis 3 (high settings)





Far Cry 3 (highest settings)





F.E.A.R. 3 (highest settings)





Formula 1 2013 (highest settings)





Hard Reset (highest settings)





Hitman Absolution (highest settings)





Lost Planet 2 (highest settings)





Max Payne 3 (highest settings)





Metro Last Light Redux (high settings)





Resident Evil 6 (highest settings)





Serious Sam 3 (highest settings)





Starcraft 2 Wings of Liberty (highest settings)





Syndicate (highest settings)





Watch Dogs (highest settings)





Witcher 3 (high settings)





I've run each benchmark several times in a row, and the numbers are as real as you can get.

CONCLUSIONS

Hyper-Threading (HT) does not deliver any notable gaming performance improvement at smooth gaming settings, despite scoring slightly higher maximum frame rates in some games. In fact, HT decreases minimum frame rates by a small margin in Crysis 3, Metro Last Light Redux and Hard Reset - these were not just random scores; I made countless runs and got the same pattern.

Now, this does not mean HT is fundamentally useless in gaming - at low resolutions and quality settings HT might just pull a notable lead, but since FPS will be way above 100 anyway, it will not really matter. What matters is saving money for the right gaming PC. I bought a used Core i7 3770 to replace my i5 3570K because I need HT for video editing, not gaming.


----------



## Ja.KooLit (Oct 1, 2015)

I believe there are no significant gains with HT on. This is because a lot of games are not optimized for multi-core CPUs. That's where DX12 comes in. Hopefully, DX12 games will utilize all cores, bringing real improvements for CPUs with more cores or HT.


----------



## Dent1 (Oct 1, 2015)

night.fox said:


> I believe that there is no significant gains with HT on. This is because alot of games is not optimized for multi core cpus. Thats where DX12 comes in. Hopefully, dx12 games will utilize all cores thereby alot of improvements with more cores or HT cpus.



Keep in mind HT doesn't constitute multiple cores. It's a scheduling technique, not a core.

A lot of the games above do support multi-threading. I suspect most of these games are a few years old and the processor tested is at the very high end of performance. The games are probably hitting their frame rate limits regardless of whether HT is enabled or disabled. Basically, these games are not stressing the CPU enough.


----------



## Schmuckley (Oct 1, 2015)

Nice review!


----------



## Ferrum Master (Oct 1, 2015)

well... so we have a strong argument... that lower-mid tier cards do not need an i7. Well yes... obviously... it would be fun to see a 970 and a 380, as they are really the mid tier now.


----------



## FordGT90Concept (Oct 1, 2015)

There are a few outliers, but for the most part it seems not to matter either way. The largest deviation in average FPS (Metro Last Light Redux) is 5.5%.

You should rerun Metro Last Light Redux with HTT on to double check those numbers aren't outliers.  If the numbers aren't up where they should be, run it again setting the process affinity to core 0, 2, 4, 6.  The problem might be that Metro Last Light Redux or Windows isn't load balancing well.
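On Windows this can be done without extra tools via `start /affinity 55 game.exe` from a command prompt - 0x55 is the bitmask selecting logical CPUs 0, 2, 4 and 6. A quick Python sketch showing where that mask comes from (the helper function is just for illustration):

```python
def affinity_mask(cpus):
    """Bitmask for `start /affinity`: one bit set per chosen logical CPU."""
    mask = 0
    for cpu in cpus:
        mask |= 1 << cpu
    return mask

# One hardware thread per physical core on a 4-core/8-thread chip:
print(hex(affinity_mask([0, 2, 4, 6])))  # -> 0x55
```

With HT on, logical CPUs 0/1 share the first physical core, 2/3 the second, and so on, so picking every other logical CPU keeps the game on distinct physical cores.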


----------



## Aquinus (Oct 1, 2015)

The same tests with the 3770 set to only 2 cores enabled with and without HT would probably provide some more insightful numbers. It's not really new knowledge that games don't usually benefit from 8 logical threads.


----------



## Artas1984 (Oct 1, 2015)

Ferrum Master said:


> well... so we have a strong argument... that lower-mid tier cards does not need i7, well yes... obviously... it would fun to see 970 and 380 as they are the mid tier really now.



I have to stop this right here, because you are misleading people. There are graphics cards meant for 1920x1080 and graphics cards meant for 2560x1440. The GTX 970 is meant for resolutions higher than 1920x1080 - it has been described that way by many reviewers, not just me - so it would not make any difference in my setup. The graphics cards I can test at the moment - GTX 760, GTX 670, GTX 960 - are all by now mid-range products meant for no more than 1920x1080 gaming. They are almost equal in performance, and there is no need for me to get a GTX 970 for my 24" monitor, unless I want to play Crysis 3 and Witcher 3 on ultra settings. Get it now?



Aquinus said:


> The same tests with the 3770 set to only 2 cores enabled with and without HT would probably provide some more insightful numbers. It's not really new knowledge that games don't usually benefit from 8 logical threads.



Nice idea. Perhaps I should do that.



FordGT90Concept said:


> There's a few outliers but for the most part, it seems to not matter either way.  The largest deviation (Metro Last Light Redux) in average FPS is 5.5%.



There is only one outlier in the benchmark - Hard Reset's maximum frame rate peaking over 230 FPS on both configurations, which does not really matter.



FordGT90Concept said:


> You should rerun Metro Last Light Redux with HTT on to double check those numbers aren't outliers.  If the numbers aren't up where they should be, run it again setting the process affinity to core 0, 2, 4, 6.  The problem might be that Metro Last Light Redux or Windows isn't load balancing well.



I already checked it ten times. When I got these results I was very surprised too; that is why I ran the Metro Last Light Redux test 10 times in a row, to be absolutely sure those numbers were not just coincidence. 10 out of 10 times, the minimum FPS would not go above 49 with HT on.


----------



## R00kie (Oct 1, 2015)

You wanna see the difference? Get a higher end card, or an SLI setup, then you'll get your difference.


----------



## Ferrum Master (Oct 1, 2015)

Artas1984 said:


> I have to stop this right here, cause you are misleading other people. There are graphics cards meant for 1920X1080 resolution and there are graphics cards meant for 2500x1440 resolution. GTX970 is meant for higher resolution than 1920X1080, it has been quoted like that by many reviewers, not me, so it would not not make any difference in my setup. The graphics cards i can test atm - GTX760, GTX670, GTX960 are all by now mid-range products meant for no more than 1920X1080 HD gaming. Those are almost equal in performance, and there is no need for me to get a GTX970 for my 24" monitor, unless i want to play Crysis 3 and Witcher 3 on ultra settings. Get it now?



No, I don't. Because it is low-mid tier... it is meant for 1080p... just like the 970... 1080p is the most-used resolution, and the 970 delivers good 1080p performance at max settings but will start to struggle at 1440p, where you have to tone settings down. Anno 2015, the 970 is a mid-tier card, that's it.

There is no use benching such entry-level cards for being CPU bound... the CPU's single thread is so mighty that it maxes out the card; it is fed with data 99% of the time.


----------



## FordGT90Concept (Oct 1, 2015)

Artas1984 said:


> I already ten times checked it. When i've scored these results i was very surprised too, that is why i did 10 Metro Last Light Redux tests in a row to be absolutely sure that those numbers are not just coincidence. 10 out 10 times the minimal FPS would not go above 49 with HT on.


So try it using the CPU affinity as I said.  That would show whether Windows' scheduler is screwing up or if Metro somehow breaks SMT.


----------



## CounterZeus (Oct 1, 2015)

Ferrum Master said:


> No I don't. Because it is low mid tier... it is meant for 1080p... just as 970... as 1080p is the major used resolution by everyone and 970 delivers good 1080p performance at max, and will start to struggle at 1440p, you have to tone down settings. ANNO 2015 970 is mid tier card, that's it.
> 
> There is no use to bench such entry level cards for being CPU bond... The CPU single thread is so mighty that it maxes out the card, it is 99% fed with data.



I use a GTX 970 for 1440p with pretty much maxed-out games; I would agree if you had said 4K. The GTX 970 is not a mid-range card, not even in 2015, nor does it deliver just 'good' 1080p performance. This card is complete overkill for 1080p.


----------



## EarthDog (Oct 1, 2015)

So... where is the review? The thread starts on post #2??????


----------



## Artas1984 (Oct 1, 2015)

Ferrum Master said:


> No I don't. Because it is low mid tier... it is meant for 1080p... just as 970... as 1080p is the major used resolution by everyone and 970 delivers good 1080p performance at max, and will start to struggle at 1440p, you have to tone down settings. ANNO 2015 970 is mid tier card, that's it.
> 
> There is no use to bench such entry level cards for being CPU bond... The CPU single thread is so mighty that it maxes out the card, it is 99% fed with data.



The GTX 970 might be whatever tier you like, Ferrum Master (if you compare it to the Titan X, then sure, it is a mid-range card), but the fact is that the most popular gaming computers out there right now are built from Core i5 Haswell processors, 2x4 GB of RAM and GTX 960 2 GB video cards. It makes sense to compare Hyper-Threading in a computer of similar performance, so that people know whether they should upgrade their Core i5 to a Core i7 for gaming purposes - and I think I've just made their decision quite a lot clearer...

EarthDog, I've known you for quite some time now, and while I generally appreciate your advice (for instance, not to use an AMD CPU for gaming benchmarks, among other things in the past), your sarcasm is sometimes out of place. I am going to correct the word "review" to "benchmark", because this is obviously not a CPU review. If I wanted to do real full-time reviews, I would probably have to change my profession from med. biologist/technician to PC service worker or PC magazine reviewer. I do these hardware tests from time to time - why do you always pick on me and expect something grandiose? Next time, do not insult me publicly.


----------



## BiggieShady (Oct 1, 2015)

CounterZeus said:


> This card is complete overkill for 1080p.


Depends on whether you like your minimum frame rate over 60


----------



## Atomic77 (Oct 1, 2015)

What about the Intel Core i5 4570??? I have played quite a few games on my setup with only Intel HD 4600 integrated graphics, but of course my settings are not very high.


----------



## Ferrum Master (Oct 1, 2015)

Artas1984 said:


> the fact is that the most popular gaming computers



Based on what facts? On Steam, the 970 is the second most-used GPU in the hardware survey. What do you call it then - the most popular top-end card? Please be reasonable. And yes, it is a MID-tier card, just as the manufacturer positioned it, price- and performance-wise, against the top-performing product, the 980 Ti. No personal feelings, no philosophy, plain facts. It is the most used dedicated silicon today.



Artas1984 said:


> Next time do not insult me publicly.



Nobody is insulting you, please calm down. If you can't take proper reasoning and arguments, then keep silent. You are among experienced people here, who also have plenty of professional experience in this area.



BiggieShady said:


> Depends if you like your minimal frame rate over 60



Yes, agreed... stutter is the thing. If the game dips into the 30s at minimum, the card is not enough, simple as that. No stable 60 FPS in a shooter? The card is not enough, despite it being only 1080p. Yes, there is a solid argument that the games are dated. Even the UE3-based Bioshock has horrid minimum FPS, and no CPU can change that - it lacks GPU horsepower. I won't even touch Witcher 3 and GTA5, where we can toss endless power in. And your tests use custom settings, which makes the situation incomparable with other tests: you lighten the compute load the GPU needs, so there is less overhead, i.e. less actual CPU work. It is like everyone blaming AMD for not using anisotropic filtering, as it lowers bench results.

And yes, the mainstream user will complain about exactly that! The game stutters, and that breaks gameplay. Recommendations for building a PC for someone are based on such needs. Patient people who don't mind FPS stutter are a rare breed, really. Everyone expects the maximum, a perfect result, after spending their money. So let us keep our professionalism, with everyday common sense about average user behaviour.

So what should we think? Just gulp down the results and praise them... yeah mate... pure biblical truth... or should we actually reason and bring out the real character of the graphs - that the test won't change no matter what, due to certain facts?


----------



## RejZoR (Oct 1, 2015)

Wait for DirectX 12, where the number of threads will matter far more than it does today. Today even a dual core is enough, because they stuff everything onto the first two threads anyway...


----------



## Artas1984 (Oct 1, 2015)

Ferrum Master said:


> The fact is based on what?



Based on the fact that the most popular gaming Core i5 computers in my country are sold with a GTX 960. And the price range: GTX 960 computers are 600-700 EUR in shops, while GTX 970 computers are above 1000 EUR. Different markets for different people. You think every second person in the world owns a GTX 970? People on this planet are not that rich. You have a wrong impression, because those of us lurking on forums like TechPowerUp have some knowledge about what is what, but believe me - I know tons of people who don't even speak English, want to play Tanks online, and are looking for a below-500-euro PC build. A GTX 970 is not an option for them.

> Nobody is insulting you, please calm down. If you can't take proper reasoning and arguments, then keep silent. You are in between experienced people also, who also have enough professional experience in this area.

I have a history with EarthDog; this is not your business, and there is no need to calm me down, as I am calm. And yes, there are far more experienced people in this reviewing business than me, like EarthDog - that's why there was no point in him mocking my benchmark. Even though I am not your regular "case fan", I have been repairing notebooks since 2007 and have sufficient experience in hardware. Just leave it and continue with the benchmarks.



Atomic77 said:


> What about the Intel Core i5 4570??? I have played a quite a few games with my setup and only having a Intel HD4600 integrated graphics but of course my settings are not very high.




So what about it? It seems like a perfect CPU for gaming.



FordGT90Concept said:


> You should rerun Metro Last Light Redux with HTT on to double check those numbers aren't outliers.  If the numbers aren't up where they should be, run it again setting the process affinity to core 0, 2, 4, 6.  The problem might be that Metro Last Light Redux or Windows isn't load balancing well.



There is no point in that. Obviously what is happening is that when HT is turned on, the cores do not reach the full clock speed defined by Turbo Boost; when HT is turned off, they run at their maximum clock speed.


----------



## FordGT90Concept (Oct 1, 2015)

Artas1984 said:


> There is no point in that. Obviously what is happening is that when HT is turned on, the main cores do not reach their full clock speed defined by turbo boost. When HT is turned off, the cores work to their maximum clock speed.


If you're so convinced of that, you can test it as well by turning off turbo boost.

SMT has little to do with turbo boost; in fact, they have an inverse relationship: the more load on the processor, the more likely SMT is to improve performance and the less likely turbo boost will be enabled.


----------



## Ferrum Master (Oct 1, 2015)

Artas1984 said:


> Based on fact that the most popular gaming Core i5 computers in my country are being sold with a GTX960



Just chill, mate. Don't take it personally. Get a beer. I gave you Steam statistics, and those are hard evidence. Those almost 4% of 970 users are roughly 350,000 users, and that is a lot of people. And if you sum up the 7970 front - which is actually more powerful than the 960, despite being 3 years old - you get half a million active gamers who actually mold the PC gaming front and its demand.

If you cry about poor countries, please top mine. We may be poor overall, with no spare money even for a defence budget, but we are thick-skinned and have endured very hard times too. Those who do their jobs live fine and have the same desires for gaming and art as everyone else here. And I know the ambitions: I have also been a hardware repair technician since 2003, and not only in PC hardware, so who cares. So leave the ambitions and e-peen levels out of this, I beg of you. Your personal clash with other members? Leave the grudge in an online deathmatch, instagib.

I agree... a 1K system gets a 970. But is that a Skylake build now? Not worth it? Most 970 owners are upgraders, or older Ivy/Haswell owners. Their system cost, especially on Anniversary Pentiums, can be so cheap that they use the delta to buy a better GPU, and I cannot blame them. CPU dependency has become weak; the only exception is GTA5. Beyond that, what matters is two highly clocked cores, really. Just as many have said.


----------



## Frag_Maniac (Oct 2, 2015)

Artas1984 said:


> I have to stop this right here, cause you are misleading other people. There are graphics cards meant for 1920X1080 resolution and there are graphics cards meant for 2500x1440 resolution. GTX970 is meant for higher resolution than 1920X1080, it has been quoted like that by many reviewers, not me, so it would not not make any difference in my setup. The graphics cards i can test atm - GTX760, GTX670, GTX960 are all by now mid-range products meant for no more than 1920X1080 HD gaming. Those are almost equal in performance, and there is no need for me to get a GTX970 for my 24" monitor, unless i want to play Crysis 3 and Witcher 3 on ultra settings. Get it now?



That's a bit harsh and inaccurate, really. You're implying that running anything more powerful than a mid-range card at 1080p is pointless. Well, having an AMD card that's more powerful than any of those you mentioned, I can most definitely tell you it is not. There are lots of games I don't run at max settings at 1080p, and with the cards you mentioned it would be even more the case.

You're also basing your assessment on your size of display to justify your point of view. There are LOTS of people using much bigger than 24" displays now. I play on 32" at 1080p, but that is dwarfed by what a lot of people play on. Even in high end monitors, 27" is quickly becoming the new norm.

Tests like this really need to be done with a range of hardware to get an accurate picture, but HT has always been one of those things that is mostly irrelevant to even bother testing, because so few games actually make use of it. The last person who argued with me that HT can make a big difference only talked about Crysis 3, and he dropped the res from 1080p to 720p to do so, making it a pretty silly argument really.


----------



## CounterZeus (Oct 2, 2015)

BiggieShady said:


> Depends if you like your minimal frame rate over 60



Well, I'm going to play happily at 1440p and let you play at a mere 1080p (a resolution we've been stuck with for way too long) with a 350-euro 'mid range' card.

There will always be new future-proof games, or terribly coded ones, that will kick you in the nuts at pretty much every resolution. And yes, the longer you wait, the more valid your statement becomes. For example, I was playing every game at 1080p with a 9600GT all those years ago; now that card is obsolete. In another year, I might have to drop some eye candy to get stable FPS at 1440p with a GTX 970.


----------



## GreiverBlade (Oct 2, 2015)

Very nice benchmark; it comforts me in my choice of a 6600K over a 6700K.

Thanks!


----------



## RCoon (Oct 2, 2015)

I don't understand why you turned off AA in every game for your benchmarks.

Also, OCAholic did it with separate processors as opposed to disabling HT: http://www.ocaholic.ch/modules/smartsection/item.php?itemid=1061&page=13

Similar results to yours, but that's to be expected. Almost no game uses more than four cores. The only people who buy i7s instead of i5s are usually those who need them for reasons other than gaming.


----------



## FreedomEclipse (Oct 2, 2015)

Should have added Total War: Rome to the benchmarks. That game is a lot more CPU-heavy than any of the games listed.


----------



## EarthDog (Oct 2, 2015)

Again, I can't see this review. The thread starts at post #2... what am I doing wrong?


----------



## 64K (Oct 2, 2015)

EarthDog said:


> Again, I can't see this review. The thread starts at post #2... what am I doing wrong?



Do you have Artas1984 on ignore? I've never put anyone on ignore, so I'm not sure how it works when the thread starter is on ignore but the members responding aren't.


----------



## AsRock (Oct 2, 2015)

Pretty much old news, but still thanks for all the effort and time.


----------



## Artas1984 (Oct 2, 2015)

Wait till November and I will redo these benchmarks with a GTX 970, OK? Same 1920x1080 resolution. That should clear things up a bit more for those who have doubts...



RCoon said:


> I don't understand why you turned off AA in every game for your benchmarks.



I use AA mostly when I benchmark video cards; it's a pixel fill-rate and memory-bandwidth dependent thing. Here I would only be choking my video card more, and the HT results would be even less accurate.


----------



## rtwjunkie (Oct 2, 2015)

CounterZeus said:


> I use gtx970 for 1440p with pretty much maxed out games. I would agree if you said 4K. GTX970 is not a mid range card, not even in 2015, nor it delivers just 'good' 1080p performance. This card is complete overkill for 1080p.


 
Ummm, you may like its performance (I notice you didn't say it could completely max out 1440p), but that doesn't change the fact that within the Maxwell hierarchy, a GTX 970 is a mid-tier card.

What matters is where it is positioned in the product stack, not how much it costs in another country or whether it is affordable. It has two cards above it and two below. It is actually the perfect 1080p card, because there are games it cannot max out even at 1080p.
Yeah, it's a firmly mid-tier card.


----------



## EarthDog (Oct 2, 2015)

64K said:


> Do you have Artas1984 on ignore? I've never put anyone on ignore so I'm not sure how that works for someone starting a thread that is on ignore followed by members responses that aren't on ignore.


ROFLMAO.. hahaha, that's it...


----------



## GreiverBlade (Oct 2, 2015)

RCoon said:


> I don't understand why you turned off AA in every game for your benchmarks.


well, from my point of view it's plenty valid; I never use AA (pointless, especially with DSR), but even at 1080p I rarely find a game that looks better or worse with or without it (yes, I have a case of "hyperopic astigmatism" - let's say I have a built-in biological AA feature...)



RCoon said:


> The only people who buy i7's instead of i5's are usually those that need them for different reasons other than gaming.


totally true... or people who don't mind paying 100-120 bucks more for "just" having more L3, MHz and HT



rtwjunkie said:


> Ummm, you may like it's performance (I notice you didn't say it could completely max out 1440p), but that doesn't change the fact that within the Maxwell hierachy, a GTX 970 is a mid-tier card.
> 
> What matters is where it is positioned in the product stack, not how much it costs in another country and whether it is affordable or not.  It has 2 above and 2 below.  It is actually the perfect 1080p card, because there are games that it cannot max out at 1080p.
> Yeah, it's a firmly mid-tier card.


nahhh, it's high low-end... (the higher tier of the low end...) the mid tier is the 980 (joke, joke)


----------



## RejZoR (Oct 2, 2015)

I can't use DSR because this stupid thing forces games to run at 60Hz and that looks absolutely horrible on a 144Hz monitor.

Frankly, it's easier to run 1080p with FXAA. Almost no performance hit and it still looks nice and can run at 144Hz.


----------



## terroralpha (Oct 2, 2015)

While I definitely appreciate the time and effort (mostly the time) put into conducting the tests and posting the results, I don't see why this is even a debate. CPUs haven't been the bottleneck in gaming performance for a few years now. Even if you have an i5 2500K, which will turn 5 years old in January, you are still set for any game that is out now or will come along anytime soon. I didn't get a 6-core/12-thread CPU because I thought it would help my gaming; I KNEW that it would not. I needed it for other reasons.


----------



## rtwjunkie (Oct 2, 2015)

terroralpha said:


> while I definitely appreciate the time and effort (mostly the time) put into conducting the tests and posting the results, but I don't see why this is even a debate. CPUs haven't been a bottleneck in gaming performance for a few years now. even if you have an i5 2500K, which will turn 5 years old in January, you are still set for any game that is out now or will come along anytime soon. i didn't get a 6 core/12 thread CPU because I thought it would help my gaming. I KNEW that it would not. i needed it for other reasons.



Hell, even some i3's (i3-4160 for one) are perfectly adequate in most games! (see signature block)


----------



## GreiverBlade (Oct 2, 2015)

rtwjunkie said:


> Hell, even some i3's (i3-4160 for one) are perfectly adequate in most games! (see signature block)


or my Alpha... i3-4130T, 2x2.9 GHz + HT... pretty much fine with any game I throw at it... even World of Warships; it needs some tweaks, but I can hold a constant 60 FPS (though the network lag is another matter... I am not at home... rather far from it, atm)


----------



## Frag_Maniac (Oct 2, 2015)

On the i5 vs i7 issue, some buy i7s for reasons other than just HT for heavily threaded apps. You also get a higher clock speed, and some prefer not to OC, or just want that guaranteed speed. Not all chips OC well, and lately Intel has been setting i7 clocks quite a bit higher; it's part of the selling point. If they only added HT, they wouldn't sell nearly as many, especially to gamers.


----------



## Artas1984 (Oct 3, 2015)

*IMPORTANT UPDATE!!!*

As requested by some members, I have redone all the benchmarks at the lowest settings at 1024x768 resolution, in order to eliminate any possible GPU bottleneck.





As you can see, the results show basically the same pattern as the highest-settings benchmarks. The only small difference might be Hitman Absolution benefiting slightly from HT.

However, notice that Batman Arkham Origins, Bioshock Infinite, Far Cry 3 and Metro Last Light Redux suffered a minimum frame rate penalty with HT on!

Now I have really proved that HT is worthless in gaming. If you did not believe me before, perhaps you do now?


----------



## Frag_Maniac (Oct 3, 2015)

Artas1984 said:


> I now i have really proved that HT is worthless in gaming. If you did not believe me before, perhaps you do now?



All you've done is verify what most already know. What compelled you to even try to prove HT is worthless in gaming? Most who hang out on tech forums with intelligent-minded gamers are well aware of it.

I can think of better things to do with one's time.


----------



## Aquinus (Oct 5, 2015)

Frag Maniac said:


> All you've done is verify what most already know. What compelled you to even try and prove HT is worthless in gaming? Most that hang out on tech forums with intelligent minded gamers are well aware of it.
> 
> I can think of better things to do with one's time.


I suspect hyper-threading is more useful for games on dual-core CPUs.


----------



## GreiverBlade (Oct 5, 2015)

Aquinus said:


> I suspect hyper-threading is more useful for games on dual-core CPUs.


Yep I saw that with my 4130T


----------



## RCoon (Oct 5, 2015)

Artas1984 said:


> I now i have really proved that HT is worthless in gaming.



*Unless you're on a multi-GPU setup
*Unless you're streaming
*Unless you do more things than just gaming

Your graphs show small percentage increases in frame rates in some titles. So while the results may not be worth it to you, there are still gains to be had for those who do care.

Sweeping statements, man - don't make them.

You know what's handy for me? Being able to play games whilst rendering a video for two hours!

EDIT: May I ask if you used the 95th percentile for your results? Most of us do to eliminate spikes in results.
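For anyone unfamiliar with the idea, the filtering works roughly like this: report the value below which 95% of samples fall instead of the raw maximum, so a single spike cannot dominate the result. A minimal nearest-rank sketch (the function name and frame-rate trace are made up for illustration):

```python
import math

def nearest_rank_percentile(samples, pct):
    """Value at the given percentile, using the nearest-rank method."""
    ordered = sorted(samples)
    rank = math.ceil(pct / 100 * len(ordered))  # 1-indexed rank
    return ordered[rank - 1]

# Hypothetical per-second FPS trace with one 230 FPS spike
fps = [58, 60, 61, 59, 62, 60, 57, 61, 60, 59,
       60, 61, 58, 62, 60, 59, 61, 60, 62, 230]
print(max(fps))                           # raw maximum is dominated by the spike
print(nearest_rank_percentile(fps, 95))   # 95th percentile ignores it
```

The same idea applies to minimum FPS: a low percentile (e.g. 1st or 5th) is more robust than the single worst frame.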


----------



## Vayra86 (Oct 5, 2015)

*When it comes to the i5 versus i7 and just for gaming, there is no reason whatsoever to get the i7.*

That is the blanket statement you can actually make. Beyond that, any blanket statement is dangerous.

People who use an i3 with HT will find tangible gains, because the two main threads can be more fully utilized by the game itself while the CPU offloads the less intensive threads (background tasks) to HT. Also, newer engines can make better use of HT; some engines run less intensive threads that can be scheduled onto HT.

Like others pointed out, there are lots of users who record while gaming. At that point the i7 is no luxury - although if you use ShadowPlay an i5 will still suffice, but that is proprietary to Nvidia cards. So again, there is no basis for saying the i5 is always enough.

About the 970 and 'mid range': the primary advantage of the 970 is in the newest titles at 1080p. If you look at anything before 2014, you will see the 970 has no real merit at 1080p. Look at TW3: turn off HairWorks and I run that 'heavy game' on my mid-range GTX 770 with relative ease at high settings - no urge to upgrade until you turn that performance hog on. Just a little perspective right there on what is 'needed to run'...

Price-wise, the 970 is not mid-range but borders on the high end. The fact that we now have big chips as part of the yearly refreshes, with astronomical price tags, does not change the playing field of the mid-range. Mid-range, to me, is still 250 EUR maximum, and if ANYTHING happened to mid-range territory, it is actually the other way around: getting a mid-range card has become *cheaper*, because gaming needs have not advanced all that much, as evidenced by the re-re-refresh of AMD's cards. The 670 was also launched as a high-end card, and the 690 was the dual-GPU version of that high-end card.

Let's keep calling apples apples, and not change them to oranges just because Nvidia and AMD have upped their pricing game and because a couple percent of the gaming market likes to lead the 'master race' with a lot of fanfare about their newest acquisitions. 1440p is still not an argument for calling the 970 mid-range, because 1440p simply is not mid-range but high end. 1080p is mid-range, and for that you only need a 970 for the newest of games at the highest settings. People on this forum sometimes forget that Ultra settings have never been mid-range territory, and neither has 60 FPS. Mid-range is ultra @ 30-45+ or medium/high @ 60 FPS.


----------



## uuuaaaaaa (Oct 5, 2015)

If I recall correctly, back in the day Enemy Territory: Quake Wars had pretty good multi-threading support. Rage (another game using an engine from id, id Tech 5) makes heavy use of multi-threading (afaik) to deal with all the megatexture stuff and so on. I would love to see how Rage scales, if possible...


----------



## rtwjunkie (Oct 5, 2015)

@Vayra86 overall I agree with you, except for the 970.  As I said previously, how many Euros it costs does not dictate whether it's mid range.  But if you want to go there, it's 350 U.S. Dollars and below, which is very affordable no matter WHAT year it is.  It may be at the higher end of the mid tier, but it is definitely a mid-tier card.

It has the 980 and 980 Ti above it, and the 960 and 950 below it.  And it struggles to play all games at high settings at 1080p.  Mostly, however, it gets that part perfect.  You also have to look at the chip used. OMG...it's not the GM200!  Knowing Nvidia's naming scheme is crucial to seeing right away that it's mid-tier.  Notice nowhere did I say it sucks.  It's a very good card, but that doesn't position it higher than the middle of the pack.


----------



## Vayra86 (Oct 5, 2015)

rtwjunkie said:


> @Vayra86 overall I agree with you, except for the 970.  As I said previously, how many Euros it costs does not dictate whether it's mid range.  But if you want to go there, it's 350 U.S. Dollars and below, which is very affordable no matter WHAT year it is.  It may be at the higher end of the mid tier, but it is definitely a mid-tier card.
> 
> It has the 980 and 980 Ti above it, and the 960 and 950 below it.  And it struggles to play all games at high settings at 1080p.  Notice nowhere did I say it sucks.  It's a very good card, but that doesn't position it higher than the middle of the pack.



Still, it is the wrong way to go about it. The 670 launched as high end, and the 970 launched as high end. Remember back when it launched, how people were saying 'this will be succeeded by a big chip'? It is an exact repeat of the Kepler release scheme, which touted the 670 as high end, and that is exactly what it was. And just like today, people frowned upon the 680/980 for its astronomical price difference with only a very small performance advantage. Affordability-wise, 350 dollars is a pretty serious investment for a GPU. You buy a console for that. Our 'bottom line' has shifted, let's keep that in mind, but the market still works in the exact same way as it always has.


----------



## rtwjunkie (Oct 5, 2015)

No, there were overpriced video cards way back....the 8800 GTX, anyone?  No one touted the 670 as high end.  Everyone who actually knows how Nvidia numbers their chips knows what their designations mean and which tier they fall into.


----------



## Vayra86 (Oct 5, 2015)

rtwjunkie said:


> No, there were overpriced video cards way back....the 8800 GTX, anyone?  No one touted the 670 as high end.  Everyone who actually knows how Nvidia numbers their chips knows what their designations mean and which tier they fall into.



By that analogy, the 7970 has also never been a high-end chip. ???

Let's get some facts. http://www.anandtech.com/show/5818/nvidia-geforce-gtx-670-review-feat-evga

From the introduction to that review:

> In a *typical high-end GPU launch* we’ll see the process take place in phases over a couple of months if not longer. The new GPU will be launched in the form of one or two single-GPU cards, with additional cards coming to market in the following months and culminating in the launch of a *dual-GPU behemoth (read: GTX 690)*. This is the typical process as it allows manufacturers and board partners time to increase production, stockpile chips, and work on custom designs.

My point stands. The introduction of a bigger chip does not push the GK104 down to a lower price tier. The big chip created its own price tier. When it comes to the current mid range, that means we are talking about a 960 or a 770. Not a 970/980.

Kepler introduced the GK106, aka the GTX 660, which was true mid range at the time, and the 660 Ti was the mid-range card above that. That's where the mid range stops. Back in the day there was nothing better than the 680, so early adopters paid high-end prices for a mid-range card. That makes sense!  We all know the 680 only led the 670 by a very small margin, hence everyone bought a 670. And what happened today with the 970 and 980? Exactly.


----------



## rtwjunkie (Oct 5, 2015)

However, just because something leads the naming scheme and price does not mean that it is a high-end card.  Everyone who knew anything about which CHIP was used knew that the 680 was not a true high-end card.  It was merely the highest performer of that series.  Same with the 970 and 980...when released, again, the top performers of the 9 series, but built on their mid-tier chip.  So my point stands too.


----------



## Vayra86 (Oct 5, 2015)

rtwjunkie said:


> However, just because something leads the naming scheme and price does not mean that it is a high-end card.  Everyone who knew anything about which CHIP was used knew that the 680 was not a true high-end card.  It was merely the highest performer of that series.  Same with the 970 and 980...when released, again, the top performers of the 9 series, but built on their mid-tier chip.  So my point stands too.



I think you fail to make the distinction beyond high end, which is the 'enthusiast segment'. You underlined this yourself: the 970 runs everything maxed out at the currently most-used resolution. That is a definition of high end for gaming. Going beyond that baseline resolution has always been a thing in PC gaming, and we call those guys 'enthusiasts'. The fact that you can do this on a single panel today does not change that fact either. 1440p panels are still pricey, and the grunt needed to max games out at that resolution is still significant. Back when the 670/680 released, the enthusiast segment was the dual-GPU solution. And then we got the Titan-class cards, which everyone agreed were a huge cash grab for the ultra rich. Now you say 'this is the high end', but the reality is: if that is really high end, it is unreachable for everyone but a mere 5% of all gamers. It is unrealistic and completely out of place to say that *that* comprises the entire high-end market.

Another analogy: CPUs. Today the marketing slides tell us we need an X99 board for gaming, because, you know, this is the real thing to have. Does that make it high end all of a sudden, when before, any kind of E-chipset was purely workstation oriented? NO. If a gamer gets X99, he is an enthusiast: he runs multi-GPU, or he runs a big chip, or both. Did the marketing of X99 change anything about the positioning of the 'normal' i5 or i7? Look at Skylake prices and you have your answer: it didn't. The i5 is still the mid-range CPU, the i7 is still high end, and the E-proc is still enthusiast or workstation. The same thing happened on the GPU side with the launch of big chips as an integral part of the naming scheme.

You are letting marketing take a run with your wallet and your mind if you tout only the big chips as high end. Also, the first sentence of your previous reply... do you realise how silly what you are saying there is??? You are saying flagship releases, which the 680 was for a LONG time, are not high-end releases.... The 690, which had a price tag of 1000 euro, was in your mind made up of two mid-range cards. I know it's Monday morning, but really? Our tech-savvy reality is not the market's reality. Here at TPU we are lining up for the newest of cards, so our bottom line has shifted to a higher price point. It is good to keep that in mind.


----------



## rtwjunkie (Oct 5, 2015)

Lol, yes really.  There is nothing silly about it.  If your memory serves you correctly, many people decried Nvidia releasing the 680 as their flagship, knowing that Nvidia had deliberately given us their mid-range chip.

As to whether the 970 can max all settings at the normal resolution, which is 1080p: it can't.  And I made that distinction earlier in the thread as well.  In fact, in my house I found I could outperform a 970 with a 780 at 1080p over 70% of the time.

Please remember, how much a product costs and where it falls in the naming scheme has no relation to which chip is used. Nvidia themselves would tell you GM204 is their mid-tier chip, GM206 their lower-end chip, and GM200 their high-end chip.  See how that works?

You have merely been confused by marketing, based on the fact that such high performance was wrung out of the GM204.


----------



## Vayra86 (Oct 5, 2015)

Fine, let's agree to disagree, I guess.

We are pretty off topic anyway.  Let's call it here.


----------



## rtwjunkie (Oct 5, 2015)

Vayra86 said:


> Fine, let's agree to disagree, I guess.
> 
> We are pretty off topic anyway.  Let's call it here.


 
Sounds good!   Having differing opinions, debating, and then agreeing to disagree is a wonderful thing.  Happy Monday!


----------

