# Will RTX 2070 work with my i5 6600 processor?



## Magicdragon (Nov 10, 2019)

Hello, I've ordered an RTX 2070 and I'm wondering if it will work properly with an i5 6600 processor. I've heard that hardware can bottleneck, and I'm curious to see if this will happen and what effects it has on gaming. I'm quite new to PCs, so any advice would be helpful.


----------



## INSTG8R (Nov 11, 2019)

Fine...don’t fall for “bottleneck hype”


----------



## Lionheart (Nov 11, 2019)

It will work fine/okay; some bottlenecks will occur in some of the more current titles, though. I'd recommend a GTX 1660 Super or an RTX 2060 non-Super, TBH.


----------



## Jetster (Nov 11, 2019)

Yes of course, not even close to a bottleneck


----------



## Nuckles56 (Nov 11, 2019)

Well, I had an i5 6500 paired with a 1080 Ti, and I picked up a hell of a lot of performance with the 3700X in quite a few games, so the CPU will hold you back in many cases.


----------



## ginokiptoli (Nov 26, 2019)

Magicdragon said:


> Hello, I've ordered an RTX 2070 and I'm wondering if it will work properly with an i5 6600 processor. I've heard that hardware can bottleneck, and I'm curious to see if this will happen and what effects it has on gaming. I'm quite new to PCs, so any advice would be helpful.


I'd recommend a GTX 1660 Super or an RTX 2060 non Super TBH.


----------



## mrthanhnguyen (Nov 26, 2019)

I wanted to upgrade my GTX 970 to a 1070 but ended up with a 2060 Super due to a good price. Then it bottlenecked my 3770K in BF5, so I had to upgrade the entire system.


----------



## GlacierNine (Nov 26, 2019)

Simple answer: Yes. 

Slightly more complex answer: Yes, although in some titles you might see less of a performance uplift than you would if you were making the same change to a system with a stronger CPU. 

There's no real reason not to do this upgrade as long as you're being realistic about what you'll get out of it. There are plenty of titles where this is going to give you a *big* boost in performance. There are some where you're going to be CPU limited and it won't make a lot of difference except in how high you can push some of the GPU-specific settings.
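The "uplift depends on the title" point above can be sketched with a toy model: the frame rate you actually see is roughly the minimum of what the CPU and the GPU can each deliver, so a GPU upgrade pays off fully in GPU-bound titles and barely moves CPU-bound ones. All figures below are invented for illustration only; they are not benchmarks of any real hardware.

```python
# Toy model: delivered FPS is capped by whichever of CPU and GPU
# prepares frames more slowly. All numbers are made up for illustration.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The slower side sets the pace you actually see on screen."""
    return min(cpu_fps, gpu_fps)

# Hypothetical per-title CPU-side limits for a quad-core CPU.
cpu_limit = {"esports title": 200, "open-world title": 70}

old_gpu_fps, new_gpu_fps = 60, 120  # hypothetical GPU-side FPS before/after

for title, cpu_fps in cpu_limit.items():
    before = delivered_fps(cpu_fps, old_gpu_fps)
    after = delivered_fps(cpu_fps, new_gpu_fps)
    gain = 100 * (after - before) / before
    print(f"{title}: {before} -> {after} fps (+{gain:.0f}%)")
```

With these made-up figures the esports title doubles its frame rate after the GPU upgrade (60 to 120 fps), while the open-world title only climbs from 60 to 70 fps: there the CPU ceiling, not the GPU, sets the pace.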


----------



## Papahyooie (Nov 26, 2019)

Jetster said:


> Yes of course, not even close to a bottleneck



Not necessarily... it depends on the use case. For gaming at high FPS (144Hz plus), it is going to bottleneck in games that are highly threaded. Overwatch is a good example: any quad core without hyperthreading (until at least a 7th-gen i5) won't be able to hold over 120fps. I had a whole thread about it some time ago. I could have upgraded to an RTX 2080 and still wouldn't have gotten over 120fps, but an upgrade to a Ryzen 5 made my 980 Ti get 180+.

@OP: Yea, it's going to bottleneck in some games, IF you are looking for high refresh rates. If you game at 60 fps, you're good. If you're trying to do high refresh rates, you're going to need more CPU grunt.

EDIT: (and honestly, if you're not doing high refresh rate, why buy a 2070 at all? Unless you're doing super high resolution, 4k, etc. Then that's fine.)


----------



## bug (Nov 26, 2019)

Magicdragon said:


> Hello, I've ordered an RTX 2070 and I'm wondering if it will work properly with an i5 6600 processor. I've heard that hardware can bottleneck, and I'm curious to see if this will happen and what effects it has on gaming. I'm quite new to PCs, so any advice would be helpful.


It depends what resolution you're using. At 4k or QHD the GPU is the bottleneck. At FHD the CPU may hold back the GPU a little, but not enough to worry about it.


----------



## Jetster (Nov 26, 2019)

Papahyooie said:


> Not necessarily... it depends on the use case. For gaming at high FPS (144Hz plus), it is going to bottleneck in games that are highly threaded. Overwatch is a good example: any quad core without hyperthreading (until at least a 7th-gen i5) won't be able to hold over 120fps. I had a whole thread about it some time ago. I could have upgraded to an RTX 2080 and still wouldn't have gotten over 120fps, but an upgrade to a Ryzen 5 made my 980 Ti get 180+.
> 
> @OP: Yea, it's going to bottleneck in some games, IF you are looking for high refresh rates. If you game at 60 fps, you're good. If you're trying to do high refresh rates, you're going to need more CPU grunt.
> 
> EDIT: (and honestly, if you're not doing high refresh rate, why buy a 2070 at all? Unless you're doing super high resolution, 4k, etc. Then that's fine.)



Our definitions of "bottleneck" are wildly different. A 10% hit is not a bottleneck, in my opinion. Notice his question: "Will it work?" The answer is yes.


----------



## lexluthermiester (Nov 26, 2019)

Magicdragon said:


> Hello, I've ordered an RTX 2070 and I'm wondering if it will work properly with an i5 6600 processor. I've heard that hardware can bottleneck, and I'm curious to see if this will happen and what effects it has on gaming. I'm quite new to PCs, so any advice would be helpful.


You have no bottleneck issues to worry about. I have an older Xeon X5680 paired with an RTX 2080 and only experience CPU bottlenecking with certain titles.

You'll be fine with your 2070 and an i5-6600! Enjoy!



Papahyooie said:


> @OP: Yea, it's going to bottleneck in some games, IF you are looking for high refresh rates.


Nonsense! They'll be just fine.


----------



## jormungand (Nov 26, 2019)

Personal case here, and I even made a thread about it:
i7 7700K @ 4.5GHz and @ 4.8GHz, 16GB 3000, RTX 2070 Super,
1440p 144Hz:
Battlefield V bottlenecks.

Games like AC Odyssey, Witcher 3 and Overwatch: no problem.
I'm still wondering what on earth is going on in BFV.
A 6600 and the RTX 2070 should be OK...

Question: what are you planning to play? e.g. MP titles/MOBAs, etc.


----------



## ppn (Nov 26, 2019)

A 2500K @ 4.0GHz bottlenecks many games to 60% load on the 2070, not allowing it to display more than 40-60 frames regardless of resolution and texture detail. I think a 6600 @ 3.6GHz is pretty much the same. I am fine with 60, but the drops to 40 are ruining the experience for me. It needs at least a 6-core, or a 4-core with hyperthreading @ 5.0GHz, or an 8-core at 3.0GHz; anything but a simple quad.
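The GPU-load figure quoted above is also a practical diagnostic: sustained GPU utilization well below ~95% while frame rates are under your target usually points at a CPU (or engine) limit rather than the GPU. A rough sketch of polling it with NVIDIA's `nvidia-smi` CLI (assumed installed and on PATH); the helper names here are my own, not a standard API:

```python
import subprocess
import time

def parse_utilization(raw: str) -> int:
    """Parse the first line of nvidia-smi's CSV utilization output."""
    return int(raw.strip().splitlines()[0])

def gpu_utilization() -> int:
    """Current GPU load (%) reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"])
    return parse_utilization(out.decode())

def average_load(samples: int = 30, interval: float = 1.0) -> float:
    """Average GPU load over a window; run while the game is rendering."""
    readings = []
    for _ in range(samples):
        readings.append(gpu_utilization())
        time.sleep(interval)
    return sum(readings) / len(readings)
```

Run `average_load()` during a busy in-game scene: a reading stuck around 60% with FPS below target matches the CPU-limited behaviour described in this post, while ~99% load means the GPU itself is the limit.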


----------



## lexluthermiester (Nov 26, 2019)

ppn said:


> A 2500K @ 4.0GHz bottlenecks many games to 60% load on the 2070, not allowing it to display more than 40-60 frames regardless of resolution and detail.


If that is happening, you're misconfiguring your setup or something else is holding you back.


----------



## hat (Nov 26, 2019)

lexluthermiester said:


> Nonsense! They'll be just fine.



Do note he said, IF you're looking for high refresh rates, which he defined as 144FPS (a common refresh rate for current high refresh monitors). You're not getting that with an i5-6600 in current titles.



lexluthermiester said:


> If that is happening, you're misconfiguring your setup or something else is holding you back.



Sounds accurate to me for a 2500K, depending on the game, of course. 4.0GHz isn't all that fast on those old Sandy Bridge cores.


----------



## lexluthermiester (Nov 26, 2019)

hat said:


> Do note he said, IF you're looking for high refresh rates, which he defined as 144FPS (a common refresh rate for current high refresh monitors). You're not getting that with an i5-6600 in current titles.


Ah, but a default resolution was not disclosed. Most people are gaming at 1080p, and at that res they should get close to (if not beyond) 144fps with an i5-6600 and a 2070. I have a friend who had an i5-3570 with his 2070S (before we upgraded him to a Ryzen 9) and was getting between 80 and 90fps in most titles at 1440p with AA locked at 16x in the driver. When I showed him the difference between his settings and AA off, he decided to shut off the AA. Then we dropped to 1080p and the FPS shot through the roof.



hat said:


> Sounds accurate to me for a 2500K, depending on the game, of course. 4.0GHz isn't all that fast on those old Sandy Bridge cores.


Agreed, but 100+fps can be done with tweaking.

@Magicdragon 
If you really want to get the most out of your system, tailor your GPU settings to your liking. Turn down (or off) anti-aliasing, as it is a very resource-intensive setting for both the GPU and CPU. Also, your 2070 has 8GB of VRAM onboard, and your system will run best with 12GB or 16GB of system RAM for most AAA titles that are RAM-intensive.


----------



## EarthDog (Nov 27, 2019)

So... your upgrade will be worth it, OP. If you play at 1080p, that CPU won't hold things back much. Things will only be noticeable with games that can use more than the 4 available cores and threads, which is a few/several titles now, with more as time goes on. Higher clocks also help.

Make no mistake about it though, gents: at 1080p a 2500K/2600K, even at 5GHz, puts a glass ceiling on most titles. So does an ancient Xeon. Just because it can reach 60fps or 144 doesn't mean there isn't a bottleneck and that additional performance cannot be had with a faster processor. It just means the bottleneck doesn't affect them for their uses... it is still there. When a tree falls in the forest and nobody is there to hear it, it still makes a sound.

Here's a good read where you can see the difference. Granted, this is with a 2080 Ti, but a 2070 is still a powerful card that can push higher frame rates at the lower res.


		


Edit: OP is a post and run... hasn't been back since a day after he posted (and never responded..).


----------



## Toothless (Nov 27, 2019)

OP probably ain't coming back. Last seen Nov 11th.


----------



## lexluthermiester (Nov 27, 2019)

Toothless said:


> OP probably ain't coming back. Last seen Nov 11th.


They might be lurking and taking in the info.


----------



## r9 (Nov 27, 2019)

INSTG8R said:


> Fine...don’t fall for “bottleneck hype”


+1

Bump the graphics settings up and the bottleneck problem is solved.


----------



## Papahyooie (Nov 29, 2019)

All you "there is no bottleneck" types, I'm sorry, but you're wrong. You simply don't fit the use case where it will bottleneck. 

As I said.. if you game at high resolutions and/or lower frame rate, it will do just fine. However, it will not be able to hit 144hz in many games, no matter what graphics card you buy, even a 2080. That is the definition of a bottleneck. Doesn't matter if it's 1% or 99%. In my case, I was bottlenecked to the tune of about 20% in a similar situation. Couldn't hold over 120fps minimums no matter how low I set the graphics. Upgraded processor, bam, problem solved. Now I'm doing 144 locked. That's a bottleneck. The fact that you don't notice it does absolutely affect your decision on what to do, but it doesn't affect the fact that the bottleneck is still there, so don't give some "you can't see 144hz anyway" bullcrap. 

Bottom line is, IF the OP wants to game at high refresh rate, he will not be able to do so in some titles, because it is a bottleneck. That's not really up for debate, it's a provable fact. What he does with that information is purely up to him. If it doesn't matter, and he doesn't plan on playing at high refresh rates, then roll on with it. But don't attempt to mislead him by saying there is no bottleneck. There most certainly is.


----------



## bug (Nov 29, 2019)

Papahyooie said:


> All you "there is no bottleneck" types, I'm sorry, but you're wrong. You simply don't fit the use case where it will bottleneck.
> 
> As I said.. if you game at high resolutions and/or lower frame rate, it will do just fine. However, it will not be able to hit 144hz in many games, no matter what graphics card you buy, even a 2080. That is the definition of a bottleneck. Doesn't matter if it's 1% or 99%. In my case, I was bottlenecked to the tune of about 20% in a similar situation. Couldn't hold over 120fps minimums no matter how low I set the graphics. Upgraded processor, bam, problem solved. Now I'm doing 144 locked. That's a bottleneck. The fact that you don't notice it does absolutely affect your decision on what to do, but it doesn't affect the fact that the bottleneck is still there, so don't give some "you can't see 144hz anyway" bullcrap.
> 
> Bottom line is, IF the OP wants to game at high refresh rate, he will not be able to do so in some titles, because it is a bottleneck. That's not really up for debate, it's a provable fact. What he does with that information is purely up to him. If it doesn't matter, and he doesn't plan on playing at high refresh rates, then roll on with it. But don't attempt to mislead him by saying there is no bottleneck. There most certainly is.


I'm sorry, but it's you who is wrong here.
A bottleneck happens when one component holds another one back (it cannot feed it fast enough). The scenario that you keep referring to is not a bottleneck: today's GPUs simply cannot deliver 144fps in most titles, period. No CPU bottleneck involved.


----------



## P4-630 (Nov 29, 2019)

I had an i5 6500 previously and it was bottlenecking my GTX 1070; in GTA V, for example, I could barely get to 80fps @ 1440p.
I upgraded my CPU to an i7 6700K and got much better results, around 100fps with the GTX 1070 @ 1440p.
Now I've got an RTX 2070 Super with the i7 6700K, and the CPU is again the bottleneck in, for example, GTA V and WD2.


----------



## ShurikN (Nov 29, 2019)

To sum it up: in older titles it will work fine; in newer titles it will perform like shit.
Also note the video was done with a 7600K overclocked to 4.8GHz, as opposed to a stock 6600 running a whole GHz slower.
It all depends on what you plan to play and at what resolution... If you want to future-proof the system for at least a couple of years, get rid of the 6600.


----------



## Papahyooie (Nov 29, 2019)

bug said:


> I'm sorry, but it's you who is wrong here.
> A bottleneck happens when one component holds another one back (it cannot feed it fast enough). The scenario that you keep referring to is not a bottleneck: today's GPUs simply cannot deliver 144fps in most titles, period. No CPU bottleneck involved.



I play all of my games at greater than 144fps every day. I don't want to make this personal, but you don't know what you're talking about. 

Can today's GPUs get 144hz in most titles at 4k? No.
Can they at 1080p? Abso-freaking-lutely. 

So again... you simply are not accounting for the situations where a bottleneck can occur. You are wrong. Plain and simple. And as such, you are misguiding the OP.


----------



## Vayra86 (Nov 29, 2019)

Magicdragon said:


> Hello I've ordered an RTX 2070 and I'm wondering if it will work properly with i5 6600 processor, I've heard that hardware can bottleneck and I'm curious to see if this will happen and what affects this has on gaming, I'm quite new to PC's so any advice would be helpful.



Yes. Broadly speaking, anything over GTX 1070 performance means you lose some of it due to lacking CPU grunt. 4C/4T also does not suffice for stable frametimes; in other words, games can stutter.

Whether this is a _problem_ comes down to your desired performance/FPS target per game, and how much you like actually using what you've paid for. A rig that is well balanced between CPU/GPU/RAM/storage is more cost-effective than one with a fat GPU but a lacking CPU. The upside to lacking balance is that you can easily move a faster GPU to a new rig; no need to upgrade all at once.


----------



## lexluthermiester (Nov 30, 2019)

Vayra86 said:


> 4C/4T also does not suffice for stable frame times, in other words, games can stutter.


Depends greatly on the game and the quality/driver settings selected. However, an i5-6600 is not an ancient CPU. If my X5680 (which is two architectures older) can keep a 2080 fed with data at 120Hz, an i5-6600 will work fine with a 2070. The OP has already been given good advice about recommended settings.


----------



## ppn (Nov 30, 2019)

4C/4T means 66% GPU load in many titles. Yes, 6C/6T will probably keep it fed. 120fps is doable only in esports titles, I guess. There will be a limit; for example, with 4C/4T I got drops to 40fps in some places and could only improve that to 50fps at 1080p medium settings, stuck at 40% GPU load. To get 99% load I had to run at 4K.


----------



## Vayra86 (Nov 30, 2019)

lexluthermiester said:


> Depends greatly on the game and the quality/driver settings selected. However, an i5-6600 is not an ancient CPU. If my X5680 (which is two architectures older) can keep a 2080 fed with data at 120Hz, an i5-6600 will work fine with a 2070. The OP has already been given good advice about recommended settings.



It does not depend on driver or quality settings at all, but on engine and game logic you can never get around. In the vast majority of games, and especially in newer titles since 2016-2017, anything that can be offloaded to the GPU, will be. The things that can't tend to struggle when there are only 4 threads available at any one time. This is not new. It should not surprise you anymore, nor should it be a discussion.

A 6600 is not only a bottleneck in the sense that a 2070 can push way higher frame rates with a faster CPU in the exact same settings; it also lacks threads when the CPU load gets higher, especially in a regular use case where the system has multiple other background processes running besides the game. Again, this isn't news, and it's time to accept that quads are well below optimal for ANY gaming rig.

Yes, any. Even with midrange GPUs you can and will feel the impact of a lacking thread count. It's not rocket science; consoles have been using more threads for quite some time now, and like VRAM demands, thread-count demands have risen to console levels: 6GB is recommended nowadays, and 6 threads are no luxury. Will it run with less? Yes. But not *smoothly*, and it is irrelevant that it works 'in many games'; the moments you get annoyed are when it won't work well in a few of them, with GPU grunt to spare. And it gets worse, not better, over time.

Your X5680 has a different thread count at 6C/12T and is not comparable in the slightest. You have literally *3x as many threads* to get work done. I speak from the experience of a quad-core Intel user in 2016. I can show you a bench to underline this (it's been done quite a few times already, mind, but you must have missed it).

Sub-60 fps. GTX 1080. 3570K @ 4.3GHz quad versus 8700K 6C/12T, below. Tell me again there isn't a bottleneck. Note the thread load as well; the 8700K has _five_ major loads on a core (I omit the HT 'cores' in RTSS), not four. Take special note of the two deep caverns in bench number one; they vanished in number two. Note also that despite sub-100% loads per core, FPS is still heavily limited.

Besides, about 'generations': the only meaningful bonus the 6600 has is DDR4. Architecturally there isn't much since Haswell to make a tangible gap in IPC. And higher clocks will never compensate for a lack of thread count when those threads are needed concurrently. Which is what is happening today.


----------



## lexluthermiester (Nov 30, 2019)

Vayra86 said:


> Does not depend on driver or quality settings at all but on engine and game logic you can never get around.


Nonsense. While the game engine does factor in, an i5-6600 is not currently troubled greatly by many games. The rest of your comment is tailor-made to support your point, simply not objective. Additionally, your pictured examples don't match each other and don't identify the CPUs being tested. Dubious at best.


----------



## Vayra86 (Nov 30, 2019)

lexluthermiester said:


> Nonsense. While the game engine does factor in, an i5-6600 is not currently troubled greatly by many games. The rest of your comment is tailor-made to support your point, simply not objective. Additionally, your pictured examples don't match each other and don't identify the CPUs being tested. Dubious at best.



Hey, if you want to live in a 4-year-old reality, suit yourself, man. But you can pick up any bench elsewhere and see similar results.

These screens are my own, so you'll have to excuse me for not writing a full review around them. What you do have is the timestamp in the filenames: pre- and post-upgrade, Jan 2018. The performance uplift, however, is being enjoyed daily, whether you believe it or not 

This is probably also entirely not objective and tailor-made:

CPU-Tests 2022: Benchmark-Bestenliste - Leistungsindex für Prozessoren (www.pcgameshardware.de) - benchmark rankings covering some 40 AMD and Intel processors.








Or this; note the remark made below it.

Or this; note the GPU, and how the gap between avg and min FPS fades beyond 4C/4T = stable frametimes.

I think we're done here.


----------



## lexluthermiester (Nov 30, 2019)

Vayra86 said:


> This is probably also entirely not objective


Of course it isn't. There's nothing identifying what it is, where it came from, or what the test parameters are. Context is important, and you're failing to provide any.
These were posted in the other thread, but they apply here as well.

Those are actual benchmarks that have been run comparing the i5-6600 (or close to it) to other CPUs and with specific GPUs. They provide proper context to reach an informed conclusion.

Conclusion: yes, the i5-6600 is starting to age a little, but it is far from useless for gaming. It still provides good performance in most games, and certainly enough to keep an RTX 2070 on its toes most of the time.


----------



## Vayra86 (Nov 30, 2019)

lexluthermiester said:


> Of course it isn't. There's nothing identifying what it is, where it came from and what the test parameters are. Context is important and you're failing to provide any.
> These were posted in the other thread, but they apply here as well.
> 
> 
> ...



Yeah, that looks absolutely amazing: you speak of realistic and verifiable, and you bring a slew of questionable 'Tubers. So this... is what your preferred gaming looks like. Dirt Rally and R6 Siege... really? They run on a toaster. And then came GTA...

Whoopsie!

Like I said, we're done. If you like your pop-in, do take a quad.


----------



## lexluthermiester (Nov 30, 2019)

Vayra86 said:


> Yeah, that looks absolutely amazing, you speak of realistic and you bring a slew of questionable 'Tubers.


They're better than anything you've offered. Prove up with something that has merit.


Vayra86 said:


>


You're nitpicking over GTA5 in-game glitches that happen to everyone from time to time, regardless of platform.


Vayra86 said:


> Like I said, we're done.


Yes, you are.


----------



## Vayra86 (Nov 30, 2019)

lexluthermiester said:


> They're better than anything you've offered. Prove up with something that has merit.
> 
> You're nitpicking GTA5 in-game glitches that happen to everyone from time to time, regardless of platform.
> 
> Yes, you are.



100% load is a game glitch, righto! All I did was fast-forward to the first moderately CPU-heavy game. It's called knowing where to look.


----------



## cucker tarlson (Nov 30, 2019)

lol, of course it'll bottleneck a 2070. This CPU will struggle to hit 60 or even 50 fps in a lot of modern games.

You guys and your fancy charts. You can always find one that will suit you on Iamrightyourewrong.com.

I played with a few 4C/4T and a few 4C/8T Intels, and a GTX 1080 needs a 4790K at least.

And FPS is not the whole story; CPUs tend to produce stutter when usage is very high.


----------



## Bill_Bright (Nov 30, 2019)

bug said:


> A bottleneck happens when one component holds another one back (it cannot feed it fast enough).


Or it cannot accept/process what it is being fed fast enough.


----------



## Vayra86 (Nov 30, 2019)

cucker tarlson said:


> lol, of course it'll bottleneck a 2070. This CPU will struggle to hit 60 or even 50 fps in a lot of modern games.
> 
> You guys and your fancy charts. You can always find one that will suit you on Iamrightyourewrong.com.
> 
> ...



No no, people with a Broadwell 6C/12T have better hands-on experience than those actually using the quads  Isn't it obvious? 

Just another day with Lex. History repeats



Vayra86 said:


> 100% load is a game glitch, righto! All I did was fast-forward to the first moderately CPU-heavy game. It's called knowing where to look.



You can laugh as a 'tit for tat', but take some time to let it sink in. You're saying 'glitch' when the actual screenshot, and the _entire sequence_, is lacking assets and textures, while the exact same run on a CPU not at its cap does not. In a 2015 game, mind. On top of that, if this really was 'a rare glitch', wouldn't that warrant running the bench again? And if not, what does that say about your source?

We already knew that a large number of games are light on CPU and heavy on GPU by comparison; this has been the status quo since the PS3 onwards. But there are exceptions, and a large number of those are specifically games you'd buy a PC for: (grand) strategy, simulations, city builders, and large, moddable, open-world content. Each and every one of them will suffer heavily. You say these are examples without merit (*sigh*); I say these are precisely the examples that count, because they are the unique selling points the PC gaming library has to offer.



lexluthermiester said:


> They're better than anything you've offered. Prove up with something that has merit.



Merit? Check any of the other benches I linked... We aim to please. But please do open your eyes, then.

You're not entirely wrong. The i5 6600 is not useless; you can play many games on it, and well. But keeping a 2070 on its toes it for sure cannot, and it will also never be a painless experience across a wide variety of games.


----------



## cucker tarlson (Nov 30, 2019)

Vayra86 said:


> No no, people with a broadwell 6c12t have better hands on experience than those actually using the quads  Isn't it obvious?


980 Ti/1070 - 3770K
1080 - 4790K
1080 Ti - 7700K

End of story. I'm not even including 4-core CPUs. A locked 6-core can experience problems with a 2070, let alone a locked 4-core.


----------



## bug (Nov 30, 2019)

Bill_Bright said:


> Or it cannot accept/process what it is being fed fast enough.


Yeah, it works the other way around, too.
Which basically means that unless all your components are perfectly matched, you have a bottleneck somewhere in your system anyway. Thus the real problem is not whether you have a bottleneck (you most certainly do), but whether that bottleneck is holding you back significantly. And "significantly" means something else to each one of us.


----------



## Bill_Bright (Nov 30, 2019)

bug said:


> And "significantly" means something else to each one of us.


To me, that means it is something I can "perceive". If it is only something that shows up in benchmark tests, it is not significant.


----------



## dirtyferret (Nov 30, 2019)

lexluthermiester said:


> They might be lurking and taking in the info.


Or they simply wanted to start a flame war. I've seen a lot of one-question-and-done accounts in the forums lately, like the OP.


----------



## lexluthermiester (Dec 1, 2019)

Vayra86 said:


> Merit? Check any of the other benches I linked... We aim to please. But, please do open your eyes then.


Those were edited in quite a while after I posted my responses.


Vayra86 said:


> You're not entirely wrong.


I know full well I'm right, I've tested such setups first hand.


Vayra86 said:


> The i5 6600 is not useless, you can play many *99% of all games* games on it, and well.


Fixed that for you.


Vayra86 said:


> But keeping a 2070 on its toes, it for sure cannot


I have already shown information that contradicts your opinion. When you have actually tried such a combo and run the benchmarks yourself, then you can talk about whether or not it will. You have not, so you can not.


Vayra86 said:


> Just another day with Lex  History repeats


Are you done with the personal jabs? You've more or less admitted you know you lost this little debate; you always resort to insults when you have. You're done here. And before anyone says that I'm reacting out of pride/ego, I would ask you to consider the following:
The OP came here asking a question and has been given much in the way of opinion. Hopefully, useful and helpful information has risen above the rest and helped the OP understand the reality of their situation.

@Magicdragon
Your system as it is will be fine for a year or two and you will get a good experience. However, while that i5-6600 is a good CPU, it is starting to show its age, and it's time to start planning an upgrade. If/when you decide that's what you would like to do, please let us know here or start a new thread. There are those here who will be happy to offer objective help in suggesting parts that will meet your needs for years to come.



dirtyferret said:


> Or they simply wanted to start a flame war. I've seen a lot of one-question-and-done accounts in the forums lately, like the OP.


That is possible; it has been going on. Still, I prefer not to make that assumption of threads that seem to be asking for actual help. Some of them seem to be blatantly starting a fight, but this one just isn't giving off that vibe.


----------



## Vayra86 (Dec 1, 2019)

dirtyferret said:


> Or they simply wanted to start a flame war. I've seen a lot of one-question-and-done accounts in the forums lately, like the OP.



Success 



lexluthermiester said:


> Those were edited in quite a while after I posted my responses.
> 
> I know full well I'm right, I've tested such setups first hand.
> 
> ...



Like I said: a 4-year-old reality. Enjoy... The info is there, including those examples that completely counter your statements, which, as usual, don't fit your narrative, so you resort to arguments about tone of voice. Like I said, history repeats, and you don't even see it. You've still failed to respond to those, btw. And they immediately kill your '99% of games are OK' statement. The only OK games are those that don't lean on the CPU, which your YouTube links are chock full of, and the odd ones out are 'a glitch' to you.

It's in-depth (me) versus blanket statements (you) about this setup. Up to readers what tickles them, indeed. But your high ground about knowing what's what is so misplaced, it's funny. You barely game enough to know what is really happening, and you're oblivious to it. Anyone who does can spot this easily, and it's a pattern with you. You're now on the multi-quote spree again. Take a step back, and take the time to look at the examples given. No need for any further responses; just try to learn a thing or two. I'm not the only one saying it, either.


----------



## oxrufiioxo (Dec 1, 2019)

Next year we will see a thread asking "will a 7600K bottleneck/work/hold back an RTX 3070", with a one-and-done post as well.

Will a 6600K bottleneck a 2070? Yes. Will a 6600K work with a 2070? Yes (it would pretty much work with a dual core from 2011, so what does that matter). Will most games play just fine? Yes.


----------



## bug (Dec 1, 2019)

Bill_Bright said:


> To me, that means if it is something I can "perceive". If it is only something that shows up on a benchmark tests, that is not significant.


100% agree, but "perceive" also differs from one man to another. Not to mention the placebo effect.
Life would be so much simpler if we were all robots/cyborgs functioning on well-defined protocols.


----------



## Bill_Bright (Dec 1, 2019)

bug said:


> 100% agree, but "perceive" also differs from one man to another.


The "range" of perception among humans is actually pretty narrow. Reaction times, on the other hand, vary widely.


bug said:


> Not to mention the placebo effect.


Yeah, it is amazing how the mind can fabricate what we want to believe even when it is not there. I would like to think that over the years, at least with computers, networks and other electronics, I've trained my mind to be objective and to stick with the facts so I cannot be influenced by the placebo effect.


----------



## bug (Dec 1, 2019)

Bill_Bright said:


> The "range" of perception among humans is actually pretty narrow. Reaction times, on the other hand, vary widely.
> Yeah, it is amazing how the mind can fabricate what we want to believe even when it is not there. I would like to think that over the years, at least with computers, networks and other electronics, I've trained my mind to be objective and to stick with the facts so I cannot be influenced by the placebo effect.


I make no such assumptions. I either measure what I think I perceive, or I simply don't care and just go with it.


----------



## PLSG08 (Dec 1, 2019)

Most of the time it really depends on the games you usually play whether your procie will hold your frames back or not.

Back then I was on an i5 7400 + RX 480 and I was getting an average of around 80 FPS in R6: Siege, with the game choking whenever you opened another window while playing. When I switched to my R5 2600 (still with the RX 480 at the time), the problem disappeared and I got an average of 140+ FPS. R6 is known to be really processor-heavy, so yeah, I did expect that to make the difference.

Most games WILL run fine on a 6600 + 2070, BUT if you want higher frames or to be able to multitask while gaming, a good procie upgrade would do wonders. You don't even have to get the latest Ryzen or a 9th-gen Intel i7; a 6th/7th-gen i7 is still plenty of horsepower, and you can find those second-hand.

Even my lil bro is experiencing choppy FPS when Discord is open while he's playing R6.
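The Siege anecdote above is the usual bottleneck arithmetic in miniature: whichever component is slower sets the frame rate you actually see. A minimal sketch of that idea (the FPS caps are made-up illustrative numbers loosely in line with the figures above, not measurements):

```python
# Rough bottleneck model: the observed frame rate is capped by whichever
# of the CPU or GPU finishes its share of the frame last.
def observed_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """Return the frame rate the slower component allows."""
    return min(cpu_fps_cap, gpu_fps_cap)

# Hypothetical caps: an i5 7400 feeding frames at ~80 FPS,
# an RX 480 capable of ~150 FPS in the same scene.
before = observed_fps(cpu_fps_cap=80, gpu_fps_cap=150)   # CPU-bound
after = observed_fps(cpu_fps_cap=170, gpu_fps_cap=150)   # after the R5 2600 swap

print(before, after)  # 80 150 -- the GPU becomes the limit instead
```

Swapping the CPU only helps up to the point where the GPU cap takes over, which is why the uplift differs so much from game to game.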


----------



## dirtyferret (Dec 1, 2019)

bug said:


> Not to mention the placebo effect.



Years ago I OC'd my Phenom II X4 955 to 3.8 or 3.9 GHz. The difference between stock and OC was night and day. The OS was snappier and games ran faster to the naked eye... a few weeks later I went into the BIOS and realized I had forgotten to save my OC settings upon exiting...


----------



## lexluthermiester (Dec 2, 2019)

Vayra86 said:


> No need for any further responses, *just try to learn a thing or two.*


Please take your own advice.


Vayra86 said:


> I'm not the only one saying it, either.


And you're not the only one failing(as usual) to comprehend context which is, as always, important.


----------



## Papahyooie (Dec 2, 2019)

We can argue about what makes a bottleneck, or what makes a bottleneck significant all day long. 

Bottom line is, with an i5 6600 and an RTX 2070, you will not get any faster framerates than if you had an i5 6600 and a GTX 1070. Maybe even a GTX 1060, depending on the game. So you have completely wasted your money, as the 2070 sits idle waiting on the i5 to send it data. The bottleneck is there. If you keep the i5 6600, you might as well not even buy the 2070, because a slower graphics card will suffice.

THAT is the operative thing that makes a bottleneck worth considering.


----------



## trog100 (Dec 2, 2019)

Papahyooie said:


> We can argue about what makes a bottleneck, or what makes a bottleneck significant all day long.
> 
> Bottom line is, with an i5 6600 and an RTX 2070, you will not get any faster framerates than if you had an i5 6600 and a GTX 1070. Maybe even a GTX 1060, depending on the game. So you have completely wasted your money, as the 2070 sits idle waiting on the i5 to send it data. The bottleneck is there. If you keep the i5 6600, you might as well not even buy the 2070, because a slower graphics card will suffice.
> 
> THAT is the operative thing that makes a bottleneck worth considering.



this entirely depends on the resolution being used.. the higher the frame rate, the harder the CPU has to work to keep up.. at higher resolutions and lower frame rates, a mid-range CPU will do fine..

trog


----------



## EarthDog (Dec 2, 2019)

trog100 said:


> this entirely depends on the resolution being used.. the higher the frame rate, the harder the CPU has to work to keep up.. at higher resolutions and lower frame rates, a mid-range CPU will do fine..
> 
> trog


This is true for the most part. However, if a game isn't getting the cores it needs, it will still suffer regardless of resolution and GPU, and the CPU will still be the bottleneck and the cause of poor performance/experience. You can take any CPU and any GPU and see this happen.

People seem to be confused: the GPU isn't the thing using all the cores/threads, the GAME is. If a game only needs 2-3 threads, then the CPU isn't holding the game back (if that makes sense). That doesn't mean it still can't hold the FPS back from the GPU, though. For example, ancient Xeons and Sandy Bridge on the Intel side, and all AMD CPUs not named Ryzen 2000/3000, will hold titles back at 1080p and lower (where most users game, according to Steam). There is a difference there.
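The resolution point and the CPU point above can be put into one toy model: per-frame CPU work (game logic, draw calls) is roughly resolution-independent, while per-frame GPU work scales with the number of pixels shaded, and the slower side sets the frame time. Every cost below is an invented illustrative number, not a benchmark:

```python
# Toy model of why resolution shifts the bottleneck between CPU and GPU.
def fps(cpu_ms_per_frame: float, gpu_ns_per_pixel: float,
        width: int, height: int) -> float:
    """Frame rate when the slower of CPU work (fixed per frame) and
    GPU work (scales with pixel count) sets the pace."""
    gpu_ms = width * height * gpu_ns_per_pixel * 1e-6  # ns -> ms
    frame_ms = max(cpu_ms_per_frame, gpu_ms)           # slower side wins
    return 1000.0 / frame_ms

# Hypothetical CPU cost of 8 ms/frame and GPU cost of 3 ns/pixel:
# CPU-bound at 1080p, GPU-bound at 1440p and 4K.
for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    rate = fps(cpu_ms_per_frame=8.0, gpu_ns_per_pixel=3.0, width=w, height=h)
    print(f"{w}x{h}: {rate:.0f} FPS")
```

With these made-up numbers, a faster CPU would only raise the 1080p figure; at 4K the GPU is the wall, which is the usual "bottleneck depends on resolution" result.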


----------



## lexluthermiester (Dec 2, 2019)

Papahyooie said:


> Bottom line is, with an i5 6600 and an RTX 2070, you will not get any faster framerates than if you had an i5 6600 and a GTX 1070.


That is most definitely incorrect. It depends heavily on the game being played and the settings being used. For example, I have an older X5680 and went from a 1080 to a 2080, and the difference was significant, dramatically so in some cases. A 2070 will most definitely improve performance on an i5-6600, as the OP will discover when they receive theirs and run their games.

However, the OP has not stated what they are upgrading from. If they're upgrading from a 1080, the impact will be smaller, though still noticeable. If they're upgrading from something below a 1070, the difference will be very dramatic.


EarthDog said:


> However, if a game isn't getting the cores needed, it will still suffer regardless of resolution and still be a bottleneck.


Four cores that can all-core turbo to 3.7 GHz are just not going to suffer that greatly. The OP will be fine.


----------



## EarthDog (Dec 2, 2019)

lexluthermiester said:


> 4 cores that can all-core turbo to 3.7ghz is just not going to suffer that greatly. The OP will be fine.


No shit. I said exactly that in my first post in this thread. I am simply clarifying information in this obtuse thread.


EDIT: I wonder how long the staff are going to leave this thread open? The OP hasn't been to the site since 11/11 and people are still going back and forth... only at TPU!


----------



## bug (Dec 2, 2019)

Papahyooie said:


> We can argue about what makes a bottleneck, or what makes a bottleneck significant all day long.
> 
> Bottom line is, *with an i5 6600 and an RTX 2070, you will not get any faster framerates than if you had an i5 6600 and a GTX 1070*. Maybe even a GTX 1060, depending on the game. So you have completely wasted your money, as the 2070 sits idle waiting on the i5 to send it data. The bottleneck is there. If you keep the i5 6600, you might as well not even buy the 2070, because a slower graphics card will suffice.
> 
> THAT is the operative thing that makes a bottleneck worth considering.


Even if that were true, you could still enable RTX.


----------



## Bill_Bright (Dec 2, 2019)

Papahyooie said:


> Bottom line is...


No! The bottom line is, there are WAY TOO MANY variables (in terms of hardware, software, the user's tasks, the user's preferences, the user's perceptions) for any one person to claim they know what the bottom line is.


----------



## lexluthermiester (Dec 2, 2019)

Bill_Bright said:


> No! The bottom line is, there are WAY TOO MANY variables (in terms of hardware, software, the user's tasks, the user's preferences, the user's perceptions) for any one person to claim they know what the bottom line is.


Excellent point, really. But we can still generalize about probabilities, would you not agree?


----------



## Bill_Bright (Dec 2, 2019)

lexluthermiester said:


> We can generalize about probabilities, would you not agree?


As long as those generalizations are expressed as personal opinions and not fact. Your bottom line may be, and likely is, totally different from mine - and that is likely true for everyone else.

There is no "one size fits all".


----------

