
Fastest graphics card for i7-6700

980 Ti/1070 Ti/2070S

You need a faster CPU if you want anything better.
 
Wow. The 2070S seems a little extreme. But at higher res, with an i7-6700K that you could OC, it might make sense to go with that card. I'm stuck with the non-K chip and no OC. Thanks.

See if your BIOS has a Multi-Core Enhancement setting, allowing all 4 cores to hit the single-core max turbo.
 
I'm darn near at the top of the food chain for my board now with the 6700. If I get a 6700K, 7700, or 7700K, I don't think I'd gain enough to make it worthwhile. And my board doesn't OC. (The board/CPU was originally in a steel-industry PC; a hand-me-down.)
You're right, putting any more money into that socket is not a good idea. Why not just max out your card, then plan for a CPU upgrade in a year or two?
 
Why not just max out your card, then plan for a CPU upgrade in a year or two?
That's exactly what I was going to say. Just buy the best GPU in your budget and go from there. You're unlikely to keep this platform long term anyway (plus, if you're running a socket this old, it's unlikely you're targeting a high-end GPU anyway), so I wouldn't even worry about it.
 
As the title suggests, I'd like to ask opinions on what would be the best card to pair with the oldish i7-6700 for QHD (1440p) gaming. The CPU being a bit of a bottleneck to the card is acceptable, because I know every game is different. One game may be CPU-limited, the next GPU-limited. Thank you.
An RTX 3090 Ti is the fastest.


You answered your own question already: in CPU-limited games, you may have lower FPS, but in GPU-limited games there is no such limit.

I have an i7-6700 here, and if I wasn't on custom water, I'd love to throw the 3090 in there for some basic benchmarks - they've aged well.
One thing that helps with higher-end GPUs is making sure your RAM is set up correctly, and as fast as possible - it helps feed the card.
 
Realistically, for this CPU I would say in the range of 1080 Ti / 2080 Super maximum. 2080 Ti / 3070 would be too much for this, unless you're strictly a 4K gamer, but even then it's not perfect. 4K doesn't absolve CPU bottlenecks if the CPU is older and weaker. I would pair it with a 3060 or 3060 Ti maximum, or a 6600/6700 XT in the Red camp.
 
Wow. The 2070S seems a little extreme.
Not really. I had a 2080 paired with a Xeon W3680 (effectively an i7-980X) in a Dell T3500. There was some CPU bottlenecking going on, but not as much as you might think. For a 6700, you could go with an RTX 3070 or RX 6800 and you'd be good.
 
I'd go 2060 or 5600 XT and skip the GTX 1000 series. Maybe a 6600 or an RTX 3060 would do as well - whichever comes cheaper.
 
My 2-year-old RTX 2080 SUPER works very well with my aged 2700K, if that helps.
 
RX 590, RTX 3050? Jesus Christ. People, get a grip - this isn't a Core 2 Quad or some random nearly useless e-waste; this is basically an i3-10100F, but a bit older. It will take much faster cards perfectly fine and is still okay even for 4K gaming at 60 fps. It's still a perfectly decent CPU.

I think you underestimate the RTX 3050 and/or overestimate what a 7-year-old locked quad-core can do. Even if this was an overclocked i7-6700K, which it isn't... you'd probably lose frames with an RTX 3060 on it in many games, even if your system had carefully tweaked memory. This processor is worse than a Core i3-10100F in practically every regard.

There's also the fact that OP is using a low-end 500-watt power supply and has no intention of replacing it, which means that your options become limited as power requirements increase. A 3050 is a plug-and-play solution that works there, and the RX 590 will also live with that power budget; OP also has the option to buy a non-mined one off a friend for a nice price, making it a legitimate option in this case.

With so few cores at limited frequencies and TDP, and an aging architecture, whether the chip is unlocked and overclocked can easily make or break the day. TPU's i3-10100 review, done with a 2080 Ti, shows a fairly sizable gulf in performance in many games - adequate, but it leaves a lot of GPU power on the table as well:


And that, of course, is considering that Comet Lake has pretty much most, if not all, of the Skylake family's security bugs fixed in hardware, without the infamous slowdown problems you'll run into with an aging 6th-gen processor.
 
I think you underestimate the RTX 3050 and/or overestimate what a 7-year-old locked quad-core can do.
I'm neither. The Xeon W3680 mentioned above is an 11-year-old 6-core that STILL holds its own today. Admittedly, it is getting dated. However, a 6700 is still just fine for modern gaming. It's you who underestimates what older CPUs can do.
 
I'm neither. The Xeon W3680 mentioned above is an 11-year-old 6-core that STILL holds its own today. Admittedly, it is getting dated. However, a 6700 is still just fine for modern gaming. It's you who underestimates what older CPUs can do.

I never said they weren't usable, just relatively dated and thus "unworthy" of being used with modern GPUs. I firmly believe that you'll get a much better experience by using a modern processor in conjunction with a GPU that's at or above the GTX 1080's level (which is ~RTX 3060/RX 6600 in modern equivalent). I suggested the RTX 3050 due to the 500W Antec power supply that OP mentioned and seems to have no intention of replacing. It also honestly doesn't look like that i7-6700 is installed on a Z170 board with unlocked, tweaked memory either.

I used my i7-990X to some capacity until 2018 or so. You can still run games on them, yes - if they boot at all (due to the lack of AVX instruction set support on 1st-gen processors). But you'll quickly see how badly these processors have aged once you put the same graphics cards in anything new and properly set up. Pre-Sandy Bridge makes one's life less than great nowadays.
 
I never said they weren't usable, just relatively dated and thus "unworthy" of being used with modern GPUs.
That's the point: they are worthy of modern GPUs.
I firmly believe that you'll get a much better experience by using a modern processor in conjunction with a GPU that's at or above the GTX 1080's level (which is ~RTX 3060/RX 6600 in modern equivalent).
Some people can't afford both a new GPU and a new CPU/motherboard/RAM. They can afford one or the other. They need an objective perspective.
Here's one:
 
That was my CPU; it paired very well with an RX 5700. Something similar would be my recommendation - more than that may be wasted money.
 
That's the point: they are worthy of modern GPUs.

Some people can't afford both a new GPU and a new CPU/motherboard/RAM. They can afford one or the other. They need an objective perspective.
Here's one:

I mean, that video proves my point precisely: the X5675 is coming in last in every benchmark he's run, and in percentage terms the losses are approaching 40%. I'll agree that the frame rates you'll get are "fine" (disregarding the rest), but you're wasting the GPU. At the end of the video he straight up says that using the 3090 with that processor knocks it down to the RTX 2080 Super's level, and he also touches on the same hypothetical I mentioned earlier: some games won't run at all because AVX isn't supported.

OP simply wants a practical upgrade that will be as light on the wallet as possible and not leave much on the table, so I think I'll stand by my recommendation of the 3050. :)
 
I should say a bit more about this: it also depends on what games you play, your monitor's refresh rate, and the resolution.
It is not the same if you play single-player visual-safari games at 4K as if you play CS:GO, Valorant, PUBG, or whatever your poison is on a high-refresh-rate 1080p monitor.
Picking a GPU should also take that into account.
 
Any card could experience stuttering with this CPU; not worth the risk.

Speaking of Xeons, the E-2176M 6/12 BGA-to-LGA 1151 mod can run on some motherboards just fine.
 
the best card to pair with the oldish i7-6700 for QHD (1440p) gaming ... I would like to stay at $300
In general, you're going to be GPU-bottlenecked @ 1440p in modern games. Conversely, the CPU is your limit if you want the highest fps in multiplayer titles. And in older and/or badly optimized DX11 games, you'll be held back by the ST performance of your i7-6700.

As has been suggested, I'd buy the fastest video card you can afford at the moment. When you upgrade your PC in the future, the new GPU can easily be transferred. Currently the RX 6600/XT appear to have the best price/perf ratio. In Europe they start at EUR 225 and EUR 260, respectively, excluding local tax.
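The CPU-limit vs GPU-limit point above can be sketched as a toy model (illustrative only - the function and numbers below are made up for the example, not benchmarks): your frame rate is roughly capped by whichever side finishes its work last, and only the GPU side gets heavier with resolution.

```python
def effective_fps(cpu_fps_cap: float, gpu_fps_at_1080p: float,
                  width: int, height: int) -> float:
    """Scale a hypothetical 1080p GPU frame rate by pixel count,
    then take the lower of the CPU and GPU limits."""
    pixel_scale = (width * height) / (1920 * 1080)
    gpu_fps = gpu_fps_at_1080p / pixel_scale
    return min(cpu_fps_cap, gpu_fps)

# Same imaginary CPU/GPU pair at two resolutions:
print(effective_fps(120, 200, 1920, 1080))  # CPU-bound: 120.0
print(effective_fps(120, 200, 2560, 1440))  # GPU-bound: ~112.5
```

Bumping the resolution shifts the same system from CPU-bound to GPU-bound, which is why a faster card still pays off at 1440p even behind an older CPU.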
 
To be honest, my overclocked 4790K at 4.6 GHz was holding back my 1080 Ti. If that doesn't give OP an idea, then I don't know what will.
 
People underestimate 1440p a lot.
You could throw a 3090 Ti in that PC and still have almost all games max out the GPU.
Here is a 4790K (overclocked) with a 3090 at 1080p. (1440p would be at least 33% more load.)
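To sanity-check the "more load" figure (a quick back-of-the-envelope, assuming GPU load scales roughly with pixel count):

```python
# Pixels per frame at each common resolution
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_1440p = 2560 * 1440   # 3,686,400
pixels_4k    = 3840 * 2160   # 8,294,400

# 1440p pushes ~78% more pixels than 1080p, so "at least 33% more load"
# is actually a conservative estimate.
print(pixels_1440p / pixels_1080p)  # ~1.78
print(pixels_4k / pixels_1080p)     # 4.0
```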
 
I'm darn near at the top of the food chain for my board now with the 6700. If I get a 6700K, 7700, or 7700K, I don't think I'd gain enough to make it worthwhile. And my board doesn't OC. (The board/CPU was originally in a steel-industry PC; a hand-me-down.)

If you can't overclock, that sucks. It limits what you can do.

A 7700K OC'd to near 5 GHz would work with a lot of cards without a significant bottleneck, as long as you're at 1440p max details or above.
 
People underestimate 1440p a lot.
You could throw a 3090 Ti in that PC and still have almost all games max out the GPU.
Here is a 4790K (overclocked) with a 3090 at 1080p. (1440p would be at least 33% more load.)

What?! You do not need a 3090 Ti for 1440p, that's insane. What are you doing, playing CP77 on ultra? You don't even need a 3080.

"More power more better", but that's insane.
 
People underestimate 1440p a lot.
You could throw a 3090 Ti in that PC and still have almost all games max out the GPU.
Here is a 4790K (overclocked) with a 3090 at 1080p. (1440p would be at least 33% more load.)

From experience as someone who's owned a 3090 since launch day, Ampere scales very poorly to lower resolutions, and depending on the game, that includes 1440p. At 1080p you're looking at barely the same performance an RX 6800 is going to achieve, sometimes less. Frame rates will not be anywhere near as nice as they should be. This isn't a CPU bottleneck; it's an internal architecture issue with super-wide GPUs like this. You've seen similar scalability issues before in the Radeon Fiji and Vega architectures.

The RDNA2 architecture behaves the other way around: it performs better at 1080p and runs out of gas as you crank the resolution up. On Ampere, the equivalent impact isn't anywhere near as bad; actually, you'll find the odd situation where increasing the resolution *helps* the overall experience with an RTX 3090. I feel like these "grab an old CPU from year X and tack a 3090 onto it" videos are rather meaningless because of that.
 
For $300, I would say a 2060 Super.

I am running a 3060 and an 8600K; I think the 8600K is the limiting factor in my system presently. For most games I run 1440p and medium settings to try for 100+ FPS. I only mention this because the 8600K and the 6700K are similar performers.
 