# Gaming benchmarks: Core i7 3770K 4.5 GHz vs Core i7 6700K 4.5 GHz



## Artas1984 (Jul 30, 2017)

Let's say you have a Core i5 2500K and an overclockable S1155 motherboard: P67, Z68, Z75 or Z77. You also have a GeForce GTX680 or Radeon HD7970, but you want to play games on your brand new 120/144 Hz gaming monitor at 1920x1080 with a stable 120/144 FPS. However, your PC can barely push 60 FPS in demanding games at the best of times. The first obvious choice would be to get something like a GeForce GTX1080, but then comes the CPU dilemma: sell the whole PC and get a new Core i7 6700K/DDR4 system, or just upgrade your Core i5 2500K to a Core i7 3770K? The Ivy Bridge Core i7 3770K is the best CPU you can get for the socket, and keeping in mind that your motherboard can overclock it, would it be worth keeping instead of buying a new Core i7 6700K system? This question has been asked many times, and there are already many published tests of this situation, but I always like to test things myself, so here it goes.












TEST SETUP

Intel Core i7 6700K 4.5 GHz OC
Asus Maximus 8 Ranger
Kingston 2X8 GB DDR4 2133 MHz CL14
Evga GeForce GTX1080 SC Gaming 8 GB (1868 MHz boost clock)
Windows 10 X64
NVIDIA Forceware 384.94 (DirectX 11 used in games)
------------------------------------------------------------
Intel Core i7 3770K 4.5 GHz OC (4550 MHz in gaming)
Asus Sabertooth Z77
Crucial Ballistix Tactical 2X8 GB DDR3 1600 MHz CL7
Evga GeForce GTX1080 SC Gaming 8 GB (1868 MHz boost clock)
Windows 7 Pro X64
NVIDIA Forceware 384.94

I overclocked both processors to 4.5 GHz, since that way you can also simulate Core i7 7700K performance: it reaches 4.5 GHz with just Turbo Boost and no OC. As you can see, I used "basic" frequency RAM for both processors: 1600 MHz is the common "starting" frequency of DDR3 for the Core i7 3770K, and 2133 MHz is the common "starting" frequency of DDR4 for the Core i7 6700K. However, I have to admit that the Core i7 3770K benefits much more from this situation than the Core i7 6700K, because this Crucial Ballistix Tactical 1600 MHz {7-7-7-20} DDR3 is some of the best RAM you can get. I tested it against Corsair Vengeance Pro 2400 MHz {11-13-13-31} DDR3, and the Crucial turned out to be the faster RAM!
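A quick way to see why such low-latency DDR3 can compete with higher-clocked RAM is to compare first-word latency in nanoseconds. A minimal sketch (the kit timings are taken from the post above; the formula is the standard first-word-latency approximation):

```python
# First-word latency (ns) ≈ CL * 2000 / transfer rate (MT/s).
# The factor 2000 converts CAS cycles of the I/O clock (which ticks at
# half the MT/s rate) into nanoseconds.
def first_word_latency_ns(cas_latency: int, rate_mt_s: int) -> float:
    return cas_latency * 2000 / rate_mt_s

kits = {
    "Crucial DDR3-1600 CL7":   (7, 1600),
    "Corsair DDR3-2400 CL11":  (11, 2400),
    "Kingston DDR4-2133 CL14": (14, 2133),
}

for name, (cl, rate) in kits.items():
    print(f"{name}: {first_word_latency_ns(cl, rate):.2f} ns")
```

This works out to about 8.75 ns for the DDR3-1600 CL7 kit versus roughly 13.13 ns for the DDR4-2133 CL14 kit, which is why the tight Crucial kit can punch above its clock speed in latency-sensitive games.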

If you would like to know the difference in gaming between DDR4 2133 MHz vs DDR4 3000 MHz, open this right now:

*Gaming benchmarks: DDR4 2133 MHz VS DDR4 3000 MHz (Core i7 6700K)*

So now that you know how much Core i7 6700K would gain by having faster RAM, let's get on with the benchmarks!

On both computers the games were tested with Fraps, using 3 or 4 identical gameplay sequences of 15 seconds each, and averaging those runs into a single result. To minimize GPU load, I only did 1080p tests and no AA was used in any game. Keeping in mind that the GeForce GTX1080 has more than enough horsepower to drive games at 1080p, this should ensure a fair battle between the Core i7 3770K and the Core i7 6700K.
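The averaging step can be sketched like this (the FPS numbers below are made up for illustration; in practice you would read the per-run min/avg/max values from the Fraps benchmark logs):

```python
# Average several Fraps runs of the same sequence into one result.
# Each run is (min FPS, avg FPS, max FPS); the values are hypothetical.
runs = [
    (52, 68, 81),
    (50, 66, 79),
    (53, 70, 84),
]

n = len(runs)
combined = tuple(round(sum(run[i] for run in runs) / n, 1) for i in range(3))
print("min/avg/max:", combined)  # → min/avg/max: (51.7, 68.0, 81.3)
```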

LET'S BEGIN!










*ASSASSIN'S CREED SYNDICATE*






A surprising start, with the Core i7 3770K equaling the Core i7 6700K. 3000 MHz DDR4 would add 1-2 additional FPS for Core i7 6700K.

*BATTLEFIELD 1*






This is what you should typically expect. 3000 MHz DDR4 would add 2-3 additional FPS for Core i7 6700K.
*CALL OF DUTY BLACK OPS 3*






A win for the Core i7 3770K that definitely comes from the great Crucial RAM latency. 3000 MHz DDR4 would add 3-4 additional FPS for Core i7 6700K, thus equaling the processors. For this game I ran 6 identical sequences for each CPU.

*DYING LIGHT*






While there is not much of an average FPS difference, the minimum FPS definitely cements the Core i7 6700K's superiority. 3000 MHz DDR4 would add 1-2 additional FPS for Core i7 6700K.

*FAR CRY 4*






Expected results in favor of Core i7 6700K. 3000 MHz DDR4 would add 1-2 additional FPS for Core i7 6700K.

*MAD MAX FURY ROAD*






With FPS this high on the display, the small victory for the Core i7 3770K is not that important. 3000 MHz DDR4 would add 1-2 additional FPS for Core i7 6700K, so Core i7 3770K would still top the chart.

*METRO LAST LIGHT REDUX*






What's up with this game liking older processors? The Core i5 2500K beat the Core i5 6500 EASILY in this game, and here the Core i7 3770K is superior to the Core i7 6700K. Bizarre; I have no words to explain it... 3000 MHz DDR4 would surely help the Core i7 6700K's maximum FPS, but not decisively the average or minimum...

*MIDDLE EARTH SHADOW OF MORDOR*






This is the only in-game benchmark used for this CPU battle. The max FPS is not important at all, since it falls within a random variation of +/- 10 %. However, the minimum FPS is more or less constant, making the Core i7 6700K a no-doubt victor. 3000 MHz DDR4 would add 1-2 additional FPS for Core i7 6700K.

*PREY*






Standard victory for Core i7 6700K. No data on 3000 MHz DDR4 performance...

*PROJECT CARS*






That was shocking! To rub salt in the wound, 3000 MHz DDR4 would add a staggering 20-25 FPS boost for the Core i7 6700K! This has to be the most memory-bandwidth-dependent game I have ever seen, never mind the latencies...

*RISE OF THE TOMB RAIDER*






Amazing stuff! This means that if you get the best RAM for your Core i7 3770K and overclock it to Core i7 6700K levels, you get a computer that matches a 3-year-newer computer frame for frame in games! 3000 MHz DDR4 would add 4-5 additional FPS for Core i7 6700K, thus equaling the processors.
*WITCHER 3 WILD HUNT*






This pretty much sums up the whole point of not upgrading your Core i7 3770K system to a Core i7 6700K system. This does not apply if you have a piece of shit H61 motherboard. 3000 MHz DDR4 also adds no benefit in this game.

------------------------------------------------------------------------------
*Please note that whenever I say DDR4 3000 MHz adds minimal FPS improvements over DDR4 2133 MHz, I am only talking about standard situations where frames are being drawn steadily. I do not extend this statement to processing-heavy scenes; in those, FPS can benefit more from higher memory bandwidth.*
I will draw no performance summary this time, since the results were so random and unexpected. I knew there would be minimal difference between the Core i7 3770K and Core i7 6700K clocked at the same speed with the same amount of RAM, but I surely expected the Core i7 6700K to win in every game. The fact that the Core i7 3770K actually managed to beat the Core i7 6700K in some games, albeit by a little, was a nice big shock.

Comments are welcome as always.


----------



## Nuckles56 (Jul 31, 2017)

That was quite interesting, how many runs did you do for each CPU and game?


----------



## hapkiman (Jul 31, 2017)

Great job, you obviously put a lot of work into this.  However, I have one point of contention, or maybe just an observation and question.  Do you not feel there would be a fundamental difference discounting hardware differences simply by using Windows 7 on one rig, and Windows 10 on the other?

Again though, excellent work.  Very interesting and a good argument for keeping those "old" dusty rigs a while longer.


----------



## Artas1984 (Jul 31, 2017)

Nuckles56 said:


> That was quite interesting, how many runs did you do for each CPU and game?



I've written it from the start - 3 to 4 runs.



hapkiman said:


> Great job, you obviously put a lot of work into this.  However, I have one point of contention, or maybe just an observation and question.  Do you not feel there would be a fundamental difference discounting hardware differences simply by using Windows 7 on one rig, and Windows 10 on the other?



Thanks. Perhaps someone else would be willing to answer that; I am not even sure what to think. I mean, sure, Windows 7 and Windows 10 perform a bit differently, but I tried to keep the specs and drivers of both computers similar. No DX12 API was used for any game in Windows 10.




hapkiman said:


> Very interesting and a good argument for keeping those "old" dusty rigs a while longer.



Not so dusty if you love your PC and keep it clean.






Only CPU and MB are old.


----------



## Melvis (Jul 31, 2017)

Where is my personal Message to say this was up?  

Great work as always and I like how you used both OS's for this test.

On a side note, get rid of AVG - I have found it slows system performance. Kaspersky has a free AV now.


----------



## Vario (Aug 1, 2017)

Thanks for this post.  It should save some money for gamer focused forum members looking to upgrade from 1155:  clearly no reason to.  Nice work!


----------



## gr33nbits (Aug 6, 2017)

Good work, and yes, I tell this to all my friends: if you have a good 3rd or 4th generation Intel CPU, don't bother swimming in the new Lakes.


----------



## EarthDog (Aug 6, 2017)

Thanks for taking the time!

"Basic frequency" is the platform's JEDEC specs.

I'd be interested to see the results using the same OS. Though it may not make much of a difference, that's a variable which shouldn't be introduced.  

I'd also be interested in seeing results with actual game settings on the GPU (read: ultra with AA) instead of artificially deflated settings which potentially exaggerate differences. People with a 1080 run ultra settings and AA at 1080p. Again, likely not a big difference... but a variable that shouldn't be added.


----------



## Artas1984 (Aug 11, 2017)

Youtube video now included. First post.



EarthDog said:


> Id be interested to see the results using the same os. Though it may not make much of a difference, thats a variable which shouldnt be introduced.
> 
> Id also be interested in seeing results of actual game settings on the gpu (read: ultra with AA) instead of artificially deflated settings which potentially exaggerate differences. People with a 1080 run with ultra settings and aa at 1080p. Again, likely not a big difference... but a variable that shouldnt be added.



I'd be interested too. Perhaps someone *ELSE* will do that stuff. I am not interested in showing any GPU settings; this is not a GPU battle. And though I agree that AA should be used when gaming at 1920x1080, it would only decrease the difference between processors and increase GPU load, so the benchmarks would have been irrelevant and we would see no difference between the Core i7 3770K and Core i7 6700K.


----------



## gr33nbits (Aug 11, 2017)

For me, even people with CPUs like the i7 2600K or i5 2xxx are up to date until the covfefe lake launch; more cores will make a difference, but that's just normal. Sandy Bridge is still kicking arses all over the world.

I upgraded 3 weeks ago and went Ryzen, and I don't regret it - my 1st ever AMD desktop CPU.


----------



## peche (Aug 11, 2017)

I need a 3770K chip for a decent price...


----------



## revin (Aug 11, 2017)

Love the comparison, great job.
I'm going to see if the upgrade to the Intel Z77 Extreme I got last month will make any difference from my DZ68 board. It will be interesting to see if it will still clock 5.0-5.2, and how those killer LP Samsungs fare, even though Intel said they were not Z68 compatible.


----------



## WhiteNoise (Aug 11, 2017)

gr33nbits said:


> For me even people with cpu's like I7 2600k or even i5 2xxx are up to date until covfefe launch and more cores will make a diference but that's just normal, Sandy Bridge is still kicking arses all over the world.
> 
> I updated 3 weeks ago so went Ryzen and don't regret it, my 1st ever AMD Desktop CPU.



I have an i5 2500K in another PC at home, and clocked at 4.5 GHz with a good graphics card it plays any game out today well. That's been a fantastic CPU.


----------



## EarthDog (Aug 11, 2017)

Artas1984 said:


> Youtube video now included. First post.
> 
> 
> 
> I'd be interested too. Perhaps someone *ELSE *will do that stuff. I am not interested in showing any GPU settings. This is not a GPU battle. And though i agree that AA should be used when gaming at 1920x1080, *it would only decrease the difference between processors and increase GPU load, so the benchmarks would have been irrelevant, and we would see no difference between Core I7 3770K and Core I7 6700K.*


Yeah... that was the point, actually. 

Where people play games, there is less of a difference, or perhaps none at all. I suppose I just don't understand the point of making an unrealistic testing environment to bring out a result that isn't there with the settings which are normally run.


----------



## Frito11 (Aug 11, 2017)

I can tell you for sure that i5-2500Ks will choke in BF1 multiplayer; the campaign is much easier on the CPU for some reason. I did some testing after I got my 6700K when I still had a 980, and even the 6700K at 4.7 GHz with hyperthreading off slightly hurt my FPS and caused drops in GPU use. Overclocked Sandy i7s and up should be OK though; it's the 4 threads vs 8 in that particular game that seems to be the real breaker - even 8-core FX CPUs overclocked do better than vanilla 4 cores. At this point, if Coffee Lake is a good overclocker, the end of this year is probably a great time for upgrading off old platforms, with both that and the Ryzen options. Games are starting to justify the need for more than 4 cores for sure, but it will take a long time until that's the norm, just like it did with 4 cores becoming the norm.


----------



## Artas1984 (Aug 11, 2017)

revin said:


> Love the comparison great job
> I'm going to see if the upgrade to an Intel Z77 Extreme I got last month will make any difference from my DZ68  board. Will be interesting at least if it will still clock 5.0-5.2 as well to see how those killer LP Samsung's fair Even tho Intel said they were Not Z68 compatible



Perhaps I have missed your previous posts, but what kind of balls are you smoking with a Core i7 2600K at 5 GHz at stock voltage?


----------



## EarthDog (Aug 11, 2017)

I've seen it before... 1.25V give or take. 

Def. a good chip though.


----------



## lyndonguitar (Aug 11, 2017)

Thanks for this benchmark!! This just justifies my decision even more not to upgrade yet. Coming from an i7-2600k@4.5Ghz user, ever since 4th gen came out I've been thinking of upgrading to a new CPU. I'd always say: "NOPE, too expensive"... DDR4 came out, still nope. I switched between 4 different cards up until my current GTX 1070, and still nope. 

To upgrade my CPU right now I'd have to change my mobo, and RAM as well. not to mention assembling and installing the system plus selling the old parts and spending a lot of cash. not worth the few FPS increase imo. I haven't even maxed my GPU yet. I could upgrade to a GTX 1080 at a lower price and still get better FPS than I'd get if I upgrade my CPU instead.


----------



## Toothless (Aug 12, 2017)

Artas1984 said:


> Perhaps i have missed your previous posts, but what kind of balls are you smoking with Core i7 2600K 5 GHz at stock voltage?


I've seen a few of those at 5 GHz. Does a good job, and I think we have a member here whose chip will hit 5 GHz.


----------



## EarthDog (Aug 12, 2017)

Plenty go to 5 GHz+... on air. If it didn't, it was below average.

The sticking point here is actually the 'stock' voltage claim. I think mine was 5 GHz at 1.275 V actual (from MM and the voltage read point). I was at 5.3 GHz before the multiplier crapped out, and 5.4x with BCLK. That was on custom 3x120 water.


----------



## revin (Aug 12, 2017)

Artas1984 said:


> Perhaps i have missed your previous posts, but what kind of balls are you smoking with Core i7 2600K 5 GHz at stock voltage?


There are plenty of screenies from way back testing the Samsung RAM with Dave @ 102 BCLK.
It's been running like this for years, since they came out.
And yes, it's on a modded old skt478 Scythe Ninja Rev.B cooler with 1 old fan.


----------



## Assimilator (Aug 13, 2017)

The fact that you keep swapping the 6700K and 3770K in your charts, depending on which one wins, is pretty confusing. I looked at the first three games in the list, where 6700K won and thus was on top, then went to Fury Road where the 3770K won, and if I hadn't read your comment I would've assumed the top graph was for 6700K as well, not 3770K.

As for Metro Last Light, I'm guessing that game engine is more sensitive to DRAM latency as opposed to bandwidth.


----------



## Artas1984 (Aug 13, 2017)

Assimilator said:


> The fact that you keep swapping the 6700K and 3770K in your charts, depending on which one wins, is pretty confusing. I looked at the first three games in the list, where 6700K won and thus was on top, then went to Fury Road where the 3770K won, and if I hadn't read your comment I would've assumed the top graph was for 6700K as well, not 3770K.



Some forum members recommended I do the graphs THIS way, and I agreed with them. People expect the winners to top the chart.


----------



## Artas1984 (Aug 13, 2017)

revin said:


> There's plenty of screenies from waay back testing the Samsung ram with Dave @102 Bclk
> It's been running like this for years since they came out.
> And yes it on a modded old skt478 Scythe Ninja Rev.B cooler with 1 old fan



Wait. If I remember correctly, the default core voltage is surely higher than 1 volt. WTH is going on?


----------



## gr33nbits (Aug 13, 2017)

1.0 V vcore for 4.9 on those CPUs, really? I never had one, so I have no idea.


----------



## Artas1984 (Aug 13, 2017)

gr33nbits said:


> 1.0v vcore for 4.9 on those cpus really? I never had one have no idea.



Well, the last time I remember getting 4.9 GHz on a chip like that, I had to use 1.35 V, and it was not even stable...


----------



## Vayra86 (Aug 13, 2017)

Great comparison, and great confirmation that sticking to my Ivy for a while longer is *just fine*.



gr33nbits said:


> 1.0v vcore for 4.9 on those cpus really? I never had one have no idea.



In the screenshot I see the CPU idling at 0%, so that explains it?

There is no way on earth you'll load this and not crash and burn at 1.0 V. Good ones can be undervolted to about 1.1-1.15 and still turbo to 4.3-4.4, but that's as good as it gets, really.


----------



## revin (Aug 13, 2017)

I believe he is referring to the stock voltage here






As you can see, its default voltage and active voltage are the same; I just added a 0.040 V Turbo voltage offset and upped the power limits.
It's what has worked from the onset with this combo, so I'm not going to complain.


----------



## Artas1984 (Oct 18, 2017)

Some new stuff: Core i7 2600K vs. Core i7 8700K










The difference between the i7 2600K and i7 8700K is supposed to be much bigger than the difference between the Core i7 3770K and Core i7 6700K, but it is even smaller, despite different games being tested. That Hardware Canucks test just confirms that I did my own test right! I mean, I compared core for core at the same MHz, just different architectures. I am shocked that the Core i7 2600K, being 900 MHz slower at the top core and having 4 fewer threads, puts up an even greater fight vs. the Core i7 8700K in games! If not for the multi-threaded program tests, in which the Core i7 2600K gets demolished, I would not believe those gaming results myself...

I am only pissed off that HC used ANTI-ALIASING in all games, as well as "just" a GTX1070. I believe that had they approached this with no AA and at least a GTX1080, the results would be fair. Sad that this does not exactly show the difference and fair sportsmanship like my test does.


----------



## peche (Oct 18, 2017)

This makes me happy that I haven't upgraded my brave locked i7 3770!




Regards.


----------



## gr33nbits (Oct 18, 2017)

Keep your 3rd and 4th and later generations, or just get a Ryzen.


----------



## peche (Oct 19, 2017)

gr33nbits said:


> or just get a Ryzen.


I'd rather stay without a PC.....


----------



## gr33nbits (Oct 19, 2017)

peche said:


> i rather myself to stay without PC.....



Yes yes, get a console then - oh wait, they have AMD inside too. Guess you have to go Intel covfefe lake then; they are great, I heard. Go go go, order a bunch for you and your friends.


----------



## peche (Oct 19, 2017)

gr33nbits said:


> Yes yes get a console then oh wait they do have AMD inside too, guess you have to go Intel covfefe lake then they are great i heard go go go order a bunch for you and your freinds.


I just don't like or care about AMD products, like many people here in this forum; that's the reason for my sarcasm. If that offended you, I apologize... just keeping the topic civilized.


----------



## Bo$$ (Oct 19, 2017)

Why upgrade? It doesn't even seem worth it if you've got an overclocked CPU.

6 years later, a £400 CPU+mobo+RAM is the same price as just the CPU. Until it costs less than £400 for double the performance, I don't think it's worth it.

It's a very expensive sidegrade.


----------



## R-T-B (Oct 19, 2017)

peche said:


> i rather myself to stay without PC.....



Oh come on...  that's just silly (and horrible!)


----------



## gr33nbits (Oct 20, 2017)

peche said:


> i just dont like  or care amd products as many people here in this forum, that's why my sarcasm, if that offended you i might apologize... just keeping the topic civilized



No, you didn't offend me, don't worry. To be honest, this Ryzen is my 1st ever AMD CPU, and I wasn't expecting this level of power and all that; really glad I went Ryzen this time around. AMD products, even if you don't care for them, are really a must - or imagine what Intel would do to consumers. Go AMD go.


----------



## Tomgang (Oct 20, 2017)

This is also one of the reasons I stay on X58/i7 980X. Gaming performance vs. price for a new setup can't justify it, as long as the X58 works properly.


----------



## EarthDog (Oct 20, 2017)

You should go 4k.


----------



## gr33nbits (Oct 20, 2017)

Tomgang said:


> This is also one of the reasons i stay on x58/i7 980x. Gaming performance vs. Price for new setup cant justified it, as long the x58 works properly.



+1


----------



## Tomgang (Oct 20, 2017)

EarthDog said:


> You should go 4k.



Was that to me?



gr33nbits said:


> +1



Thanks pal


----------



## EarthDog (Oct 20, 2017)

Yes, it was to you.

Perhaps at that res it will feel alive!!


----------



## Tomgang (Oct 20, 2017)

EarthDog said:


> Yes, it was to you.
> 
> Perhaps at that res it will feel alive!!



Alright. I am already going 4K with Nvidia DSR, and even with just DSR the difference between 1080p and 4K is definitely noticeable - it looks great (going with a 1080 Ti on X58 turned out to be a great idea, and at 4K it's the GPU that struggles to keep up). But yeah, I know, glass ceiling.

Until I can afford a 4K screen, or at least a 2560x1600 screen, I must use DSR. Right now that is not possible. Some bloody idiot hit my car and then took off without leaving a note,
so the repair bill is on me.


----------



## EarthDog (Oct 20, 2017)

When it rains, it pours...  

Sorry to hear that man.


----------



## Tomgang (Oct 20, 2017)

EarthDog said:


> When it rains, it pours...
> 
> Sorry to hear that man.



Yeah, it sucks balls and I am mad about it 

Some people should never drive or even get a driving license. He was probably drunk or something.


----------



## ZenZimZaliben (Oct 20, 2017)

I just switched from an i7 2700K running 4.85-5.0 GHz (depending on ambient) to a 6700K @ 4.8 GHz. The difference in raw frame rate is very minor, but the 6700K does deliver a smoother, more fluid gaming experience. I think a big part of that smoothness isn't the CPU but the switch from DDR3 2400 to DDR4 3200. Another nice-to-have is the motherboard and its peripherals: NVMe is a huge upgrade compared to a SATA SSD. If I had had to pay for this upgrade, it wouldn't be something I would have done or would recommend to anyone.

Even my older system, an i7 950 OC'd to 4.5+ GHz, was maybe 3-5% slower than my i7 2700K, and that was only because I could OC the 2700K higher.


----------



## Tomgang (Oct 20, 2017)

ZenZimZaliben said:


> I just switched from a i7 2700k running 4.85 - 5.0Ghz depending on ambient to a 6700k @ 4.8ghz. The difference is very minor in raw frame rate but the 6700k does deliver a smoother more fluid gaming experience. I think a big part of that smoothness isn't the cpu but the switch from DDR3 2400 to DDR4 3200. Another nice to have is the motherboard and its peripherals. nVME is a huge upgrade compared to SSD. If I would have had to pay for this upgrade then it wouldn't be something I would have done.



I can confirm about M.2 SSDs. After going from a SATA SSD to an older M.2 SSD (Samsung 950 PRO) on my X58 system, I also got a great bump in how responsive my system is. I am never going back to a SATA HDD/SSD as an OS drive ever again.

And yeah, an i7 950 at those clocks is not too shabby either. I had an i7 920 at 4.3 GHz for almost 8 years before I swapped it out for my current i7 980X. Plus, they OC great; my i7 980X can do 4.7 GHz+, but at a high voltage. Great old CPUs.


----------



## ZenZimZaliben (Oct 20, 2017)

Tomgang said:


> I can confirm about M.2 SSD. After going from a sata SSD to an older M.2 SSD (Samsung 950 PRO) on my X58 system, that gave as well a great bump in how responsive my system is. I am never going back to sata HDD/SSD ever again to OS drive.
> 
> and yeah a I7 950 at those clocks is not to shabby either. Had a i7 920 at 4.3 GHz for al most 8 years before i swapped it out for my current I7 980X + they oc great, my I7 980X can do 4.7GHz+ but at a high voltage. Great old CPU´s



Nope, me neither. It is so much cleaner looking, no wires, and the speeds are just insane. Boot times and load screens have been drastically reduced. Yeah, the ONLY reason I switched from X58 was because I was given a 2700K.


----------



## Tomgang (Oct 20, 2017)

ZenZimZaliben said:


> Nope, me either. It is so much cleaner looking, no wires, and the speeds are just insane. Boot times and Load screens have been drastically reduced. Yeah the ONLY reason I switch from x58 was because I was given a 2700k.



Totally better. Much cleaner, no stupid power/SATA cables and, as you say, much better speeds without the risk and trouble of setting up RAID. It's a blast that once the PC is through BIOS POST, it's almost ready for use immediately after boot.
Just a shame I can't get max speed out of my M.2 SSD because PCIe 2.0 limits its full speed potential, but with an x4 PCIe-to-M.2 adapter I still get read speeds around 1500-1700 MB/s and write speeds around 950 MB/s. Try doing that with a single SATA SSD.
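That PCIe 2.0 ceiling is easy to estimate: each PCIe 2.0 lane runs at 5 GT/s with 8b/10b line encoding, so only 80 % of the raw bits carry payload. A back-of-the-envelope sketch (ignoring packet/protocol overhead, which pushes real-world numbers lower still):

```python
# Rough usable bandwidth of a PCIe 2.0 x4 link.
GT_PER_S = 5.0      # PCIe 2.0 raw rate per lane, gigatransfers/s
ENCODING = 8 / 10   # 8b/10b: 8 payload bits per 10 line bits
LANES = 4

payload_gbit_s = GT_PER_S * ENCODING * LANES   # 16 Gbit/s for x4
payload_mb_s = payload_gbit_s * 1000 / 8       # convert Gbit/s to MB/s

print(f"PCIe 2.0 x{LANES} payload ceiling: {payload_mb_s:.0f} MB/s")
# → PCIe 2.0 x4 payload ceiling: 2000 MB/s
```

So the 1500-1700 MB/s reads quoted above are roughly what fits through that link after protocol overhead; on PCIe 3.0 x4 the drive itself would be the limit instead.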

Here is a screenshot of the speeds I get on a daily basis from my M.2 SSD. The max read speed differs depending on how full my drive is, but it is still far better than a SATA SSD can do alone.






I also made a boot video from when I first got the M.2 SSD, if you want to compare booting my old junk to your much newer PC. I have no complaints when it comes to boot time for hardware that came out about 8 years ago. Skip to 0:40 if you want to jump straight to the boot-up.


----------



## Artas1984 (Oct 20, 2017)

We would all appreciate it if someone could do a 4K gaming benchmark: a Core i7 Nehalem 6-core clocked at at least 4.5 GHz vs. Core i7 7700K vs. Core i7 8700K, with a GTX1080 Ti... Actually, the GTX1080 Ti would be the limiting factor.


----------



## EarthDog (Oct 21, 2017)

Artas1984 said:


> Actually GTX1080 Ti would be the limiting factor.


I'm certain the X58 system will still be slower... even if you manage to find one at 4.5 GHz...


----------



## Artas1984 (Oct 21, 2017)

EarthDog said:


> Im certain the x58 system will still be slower...even if you manage to find one at 4.5 ghz...



I am certain of that too. All the chips of that generation are officially limited to 1066 MHz DDR3. Anyway, the HC test, as I said, was not done properly, with the GTX1070 being the limiting factor and with AA used.

Digital Foundry's gaming tests show what a big difference the Core i7 8700K can provide vs. the Core i7 7700K, since they use a GTX1080 Ti and no AA, which shows that the Core i7 2600K would be totally demolished by the Core i7 8700K. I was mostly shocked by the Crysis 3 results - the Core i7 8700K opens such a huge gap that you really begin to understand how far ahead of its time Crysis 3 was in terms of technical prowess. In Crysis 3 the Core i7 7700K beats the Ryzen 7 1800X, so I thought we had reached the performance ceiling of that Crytek engine, yet the Core i7 8700K just wipes the floor with the Core i7 7700K and even beats the 8-core Core i7 7820X! This is the first time in history Crysis 3 can be played at max settings/1080p at a constant 144 Hz! Yeah, only after 4 years. Just like how only after 4 years you could finally max out the original Crysis with an HD7970.


----------



## Artas1984 (Nov 7, 2017)

Hardware Canucks redid the test, using a GTX1080 Ti instead of a GTX1070. However, they still FAIL by using custom anti-aliasing in games that are meant to show the difference between the processors... When will they learn!!!???










This is how the average FPS advantage of the Core i7 8700K over the Core i7 2600K changed by switching cards:

*Battlefield 1*: 3 % to 10 %
*Call of Duty*: 3 % to 1 %
*Deus Ex*: 1 % to 15 %
*Doom*: no change
*Wildlands*: 2 % to 7 %
*GTA 5*: 27 % to 65 %
*Overwatch*: 1 % to 4 %
*Witcher 3*: 5 % to 10 %

I believe they messed up the GTA 5 results. There is no way that swapping a GTX1070 for a GTX1080 Ti gives only an additional 2 FPS... BS.
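For reference, percentages like the ones above are just relative average-FPS gaps, computed along these lines (the FPS figures below are hypothetical placeholders, not Hardware Canucks' actual numbers):

```python
# Percent advantage of one CPU's average FPS over another's.
# The FPS values used below are made up for illustration only.
def pct_gap(fps_slow: float, fps_fast: float) -> float:
    return (fps_fast - fps_slow) / fps_slow * 100

# Hypothetical example: the gap widening after a GPU swap.
print(f"with GTX1070:    {pct_gap(100, 103):.0f} %")  # → 3 %
print(f"with GTX1080 Ti: {pct_gap(110, 121):.0f} %")  # → 10 %
```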


----------



## CAPSLOCKSTUCK (Nov 7, 2017)

Artas1984 said:


> We all would appreciate if someone could do a 4K gaming benchmark: Core i7 Nehalem 6 core CPU clocked at least 4.5 GHz VS. Core i7 7700K VS. Core i7 8700K with GTX1080 Ti... Actually GTX1080 Ti would be the limiting factor.




I can only think of one person................ @Knoxx29


----------



## FireFox (Nov 9, 2017)

CAPSLOCKSTUCK said:


> I can only think of one person................ @Knoxx29



But but I don't have a Core i7 Nehalem, an 8700K and a 1080 Ti 



EarthDog said:


> even if you manage to find one at 4.5 ghz...



even at more than 4.5 GHz


----------

