
Fastest graphics card for i7-6700

what?! you do not need a 3090ti for 1440p that's insane, what are you doing playing CP77 on ultra? you don't even need a 3080.

"More power more better" but that's insane.
Never said that.
I just showed him that he could even buy a 3090 Ti for his 6700.
At 1440p it would be absolutely no problem in 9/10 games.
 
I think you underestimate the RTX 3050 and/or overestimate what a 7-year-old locked quad-core can do. Even if this was an overclocked i7-6700K, which it isn't... you'd probably lose frames with an RTX 3060 on it in many games, even if your system had carefully tweaked memory. This processor is worse than a Core i3-10100F in practically every regard.
As if an i3-10100 couldn't run games at way over 60 fps.

There's also the fact that OP is using a low-end 500-watt power supply and has no intention of replacing it, which means your options become limited as power requirements increase. A 3050 is a plug-and-play solution that works there, and the RX 590 will also live within that power budget. OP also has the option to buy a non-mined one off a friend for a nice price, making it a legitimate option in this case.
And your recommendation is the RX 590? Is this a joke? Even the RX 580 was known to guzzle watts like there's no tomorrow, as much as a whole GTX 1080 did. The RX 590 might be in GTX 1080 Ti territory, and it's still the worst Polaris card. Not to mention it's old as fuck nowadays and doesn't even support the rather old VP9 decoding that YouTube uses. Also, the RTX 3050 is a terrible-value card. The RX 6600 slays it at everything and is cheaper; buy that instead.

With so few cores at limited frequencies and TDP, and an aging architecture, whether the chip is unlocked and overclocked can easily make or break the day. TPU's i3-10100 review, done with a 2080 Ti, shows a fairly sizable gulf in performance in many games; though adequate, it leaves a lot of GPU power on the table as well:

That i3 performs basically the same as a 3900X. That's not a weak chip at all for gaming. Also, 4C/8T CPUs are still perfectly enough for gaming, and Skylake was rebranded all the way up to Comet Lake, so the 6700 is basically the same as the i3-10100; there are no architectural differences at all. Sure, the i3-10100 has slightly higher clocks (200 MHz), but the i7-6700 has 2 MB more cache, which still works out better for the i7, and you can also crank the PL values and basically always run it at maximum boost clocks if you want. Considering how well that i3 handled an RTX 2080 Ti, the i7 can easily handle one too. An RTX 2080 Ti is basically the same as an RTX 3070, and you might still need to go further up the stack to actually find a bottleneck on the CPU side. I'm telling you, that i7 is still great.

Here are the tests of that i3:

You need an RTX 3080 to finally see small differences, and those still aren't explained by core or thread count, only by differences in cache and perhaps the higher boost on the i7 or i5. OP is still more limited by budget and power supply than by his i7. He basically needs an RTX 3080-tier card to finally see some shortcomings of his CPU. Considering the 500-watt PSU, the most I could safely recommend with some margin is an RTX 3060 Ti, RTX 2080, or RX 6700 XT; the fastest of them is the 6700 XT, and it also has the most VRAM. How they fare value-wise I have no idea, as I have never looked at prices for those cards, but if he wants something cheaper there are some great options like the RX 6600.
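
To put rough numbers on that 500-watt limit (back-of-the-envelope only; the board-power figures below are ballpark TDPs, not measurements of OP's actual system):

```python
# Rough PSU headroom estimate for a 500 W unit. All figures are ballpark
# TDP / board-power numbers, not measured values.
PSU_WATTS = 500
CPU_TDP = 65            # i7-6700 rated TDP
REST_OF_SYSTEM = 75     # motherboard, RAM, drives, fans (rough guess)

gpu_tdp = {
    "RTX 3050": 130,
    "RX 6600": 132,
    "RTX 3060 Ti": 200,
    "RTX 2080": 215,
    "RX 6700 XT": 230,
}

for card, tdp in gpu_tdp.items():
    load = CPU_TDP + REST_OF_SYSTEM + tdp
    headroom = PSU_WATTS - load
    print(f"{card:12s} est. load {load} W, headroom {headroom} W "
          f"({headroom / PSU_WATTS:.0%} of the PSU left)")
```

Even the 230 W card leaves some margin on paper; transient spikes are why I'd still keep a buffer.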


And that, of course, is before considering that Comet Lake has pretty much most if not all of the Skylake family's security bugs fixed in hardware, without the infamous slowdown problems you'll run into with an aging 6th-gen processor.
It's exactly the same arch though; Comet Lake only got a thinned CPU die and a taller IHS for better thermals, but that's it.
 

Sure it will, but it's a 10100, not a 6700. They may be similar, but at the same time they can be quite different, mostly for the reasons outlined at the bottom of this post. I mean, you've seen TPU's review I linked on the 10100, yes? It's decent, but that's about it. It was intended to go against the Ryzen 3 3100, which also doesn't really have a poor showing, except that the 10100 was a CPU you could actually buy :roll:

My personal recommendation was the RTX 3050. I simply agreed that OP buying their friend's RX 590 (non-mined) was an acceptable solution. The 590 is a newer design on a 12 nm lithography and uses a bit less power/provides a bit more performance than the earlier Polaris models. They never stated how much they're going to pay for their friend's card, but if it's up to $150? I think it's fair. The 6600 is also fair game if you can afford it, but on PCIe 3.0, I wouldn't be too comfortable saying you'll get the best out of Navi 23.

As for SKL vs. CML, yeah, it's the same architecture. Plus years of security fixes in hardware bypassing costly microcode-based mitigations, better clock speeds, an enhanced lithography, higher memory speed as standard (which matters since DRAM settings are locked down on non-Z chipsets), etc.
 
Sure it will, but it's a 10100, not a 6700. They may be similar, but at the same time, they can be quite different, mostly because of the reasons outlined in the bottom of this post. I mean, you've seen TPU's review I linked on the 10100, yes? It's decent, but that's about it. It was intended to go against the Ryzen 3 3100, which also doesn't really have a poor showing, except that the 10100 was a CPU you could actually buy :roll:
Why are you so condescending about it? It's such a great value chip, and it is fast. And you still fail to provide any reason why the 6700 would be worse than it. I don't give a shit about what it was intended to go against; the fact is that in gaming it's basically as good as an R9 3900X, and all that tells us is that gaming isn't very well threaded yet and doesn't require a ton of CPU performance to run decently. In other words, fuck off with the RTX 3050s and take this i7 seriously like you should have from the start. We aren't talking about a first-gen i7; the 6700 is still fast in games.

My personal recommendation was the RTX 3050. I simply agreed OP buying their friend's RX 590 (non-mined) was an acceptable solution. The 590 is a newer design on a 12 nm lithography and uses a bit less power/provides a bit more performance than the earlier Polaris models.
The lithography claim was debunked; AMD lied about the changes, and it was just better bins of the RX 580 with monstrous power consumption and heat output. Also, its arch is really old today. Sure, it's good enough for gaming, but it can't even decode VP9 (which AMD lied about too). Considering AMD and how they axed driver support for earlier GCN cards, I wouldn't be too confident that AMD will act rationally and won't axe support for the RX 590. I would definitely avoid it. The RX 6600 is a lot better and won't turn into e-waste for a much longer time; now that's a wise investment. The RTX 3050 is also a great card that is priced horribly. It should be around 30-40% cheaper than the RX 6600 to be competitive, but it's not, and that makes zero sense.


They never stated how much they going to pay for their friend's card, but if it's up to $150? I think it's fair. The 6600 is also fair game if you can afford it, but on PCIe 3.0, I wouldn't be too comfortable saying that you'll get the best out of Navi 23.
It will be perfectly fine; even the 6500 XT isn't exactly horrible on Gen 3, and the 6600 has twice the lanes. There's also the GTX 1650 GDDR6 if you're paranoid about that. But anyway, it shouldn't be a big concern.
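
For reference, the raw link-bandwidth math (a quick sketch; the per-lane throughput values are the usual approximate PCIe figures):

```python
# Approximate usable throughput per PCIe lane, one direction, in GB/s.
GB_PER_S_PER_LANE = {"gen3": 0.985, "gen4": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Rough total one-direction bandwidth of a PCIe link in GB/s."""
    return GB_PER_S_PER_LANE[gen] * lanes

# Navi 23 (RX 6600) exposes 8 lanes; Navi 24 (RX 6500 XT) only 4.
print(f"RX 6600 (x8) on Gen4:    {link_bandwidth('gen4', 8):.1f} GB/s")
print(f"RX 6600 (x8) on Gen3:    {link_bandwidth('gen3', 8):.1f} GB/s")
print(f"RX 6500 XT (x4) on Gen3: {link_bandwidth('gen3', 4):.1f} GB/s")
```

So the 6600 on Gen 3 still has roughly double the link bandwidth of the 6500 XT on Gen 3, which is why it doesn't fall off the same cliff.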

As for SKL vs. CML, yeah, it's the same architecture. Plus years of security fixes in hardware bypassing costly microcode-based mitigations, better clock speeds, enhanced lithography, higher memory speed as a standard (which matters since the DRAM settings are locked down on non-Z chipset), etc.
The i7-6700 still has more cache than the i3-10100, which basically covers the clock speed difference. Faster RAM support means nothing if OP won't upgrade the RAM. The lithography wasn't enhanced at all; yields and bins just got a tiny bit better over time. I would like a link about those security fixes. Either way, these are just strawman arguments, and you still seem not to accept the fact that the i7-6700 is still good for gaming. OP should just buy as much GPU as he can afford, that's it.
 

I can't recommend a $300 used 3-4 year old GPU with no warranty when there are brand-new ones for $9-13 more with a warranty.

So I say RX 6600 or better; it's not so over the top that a 2017 processor can't handle it, and it can be moved to a new rig when the time comes.
 

Well, it's a long answer, so I'm going to break it down for readability.

1. I'm not condescending, I'm simply calling a spade a spade. A 6700 is an old processor now, and being the locked variant on a low-end motherboard does it no favors. No comment on the language; being angry about it won't change anything.

2. Uh, source for that debunking? Polaris 30 is literally an all-new die on an enhanced lithography node, and Polaris 30 is the RX 590. I also never said anything against the idea of buying an RX 6600, though it is pricier and will leave performance on the table. The arch is old, but it is still supported, and at this performance segment, things like ray tracing and DirectX 12 Ultimate aren't really a must-have, IMHO.

3. True, not horrible, but also not excellent. OP wanted something on the leaner side, probably because they have no intention of upgrading again in the future without a full computer replacement.

4. I was citing the advantages of the i3-10100 over the i7-6700, all of which are easily verifiable because, well, it is what it is. However, OP is not replacing their 6700 at all. A 6700 will not perform like a 10100, even if its cache is bigger, for the aforementioned reasons. Unless you literally disable all of the microcode-level bug fixes, disable Spectre and Meltdown protection in Windows, and live with the errata...

I can't recommend a $300 used 3-4 yo gpu with no warranty when there are brand new 1s for $9-13 more with a warranty.

So I say RX 6600 or better, it's not so over the top that a 2017 Processor can't handle and can be moved to a new rig at the time.

Agreed, for $300 the 590 is indeed a raw deal. But if OP can buy their friend's card for $150 and pocket the rest, I'd happily do just that.
 
Buy a 3060 from any manufacturer. It is the cheapest high-performing 1080p card right now. I would not go any higher than that, though.

If you are looking at the used market, there are lots of 2070s out there for around 300 bucks.
 
OK. I have lots of good suggestion here now. Thank you all for your advice. I will sit down and use this info to make my decision. Thank you all very much.
 
Well, last year I had a heavily overclocked 3770K (4.7 GHz with DDR3-2133) in a system with a GTX 1080 (the motherboard died last fall). The 3770K handled the GTX 1080 fine for the most part, but there were some framerate drops here and there. When the GTX 1080 came out in 2016 there weren't too many games that needed a 4c/8t CPU, and back then a 4c/4t 4th- or 6th-gen i5 was fine with a GTX 1080. Nowadays game engines are using 8 or 12 threads and the old 4c/4t CPUs are practically obsolete. A lot of the time you'll get a good average framerate, but these low-thread CPUs are more prone to framerate dips. It depends on the game, though. A GTX 1080, RTX 2060, or RX 6600 would be fine. I wouldn't go beyond the level of a 3060 or 2060 Super, or at most as high as a 1080 Ti or 2070 Super.
 
1. I'm not condescending, i'm simply calling a spade a spade. A 6700 is an old processor now, and being of the locked variant on a low-end motherboard does it no favor. No comment on the language, being angry about it won't change anything.
It doesn't matter much for performance in games in this particular case, and yes, I'm angry, because you are essentially talking crap about how old it is and heavily underestimating it.

2. Uh, source about debunking? Polaris 30 is literally an all-new die on an enhanced lithography node. Polaris 30 is the RX 590. I also never said anything against the idea of buying an RX 6600, though it is pricier and will leave performance on the table. The arch is old, but it is still supported, and at this performance segment, things like ray tracing and DirectX 12 Ultimate aren't really a must-have, IMHO.
The RX 6600 will be a huge bottleneck by itself. You still don't understand that this CPU is fast for gaming and in no way holds back cards like that. The RX 6600 sure does have many new features, but importantly it's all about value. It consumes less power than an RX 580 and is nearly twice as fast. If it's worth ~300 USD, then the RX 580 should be worth around 150, but they're going for 200 USD, and even that 150 is from the performance numbers alone: it lacks many modern decoders, it lacks ray tracing support, it sucks more power, and AMD may stop driver support soon, or at least much sooner than for the RX 6600. Therefore nobody should buy those old Polaris refresh cards at this point; they are simply too old to be viable as a new card and certainly won't meet OP's 1440p gaming requirement. I know, I have an RX 580 and play at 1440p, but it's starting to be viable only in some games, and an LCD doesn't run well below its native 1440p. Games like Cyberpunk or RDR are probably completely out of the question.
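
Rough value math behind that, using the ~$300 / ~2x / $200 figures above as assumptions rather than checked prices:

```python
# Figures taken from the paragraph above; treat them as assumptions, not quotes.
rx6600_price, rx6600_perf = 300, 2.0   # ~$300, roughly 2x an RX 580
rx580_price, rx580_perf = 200, 1.0     # ~$200 street price, baseline

print(f"RX 6600: {rx6600_perf / rx6600_price * 100:.2f} perf per $100")
print(f"RX 580 : {rx580_perf / rx580_price * 100:.2f} perf per $100")

# Price the RX 580 would need to match the RX 6600's ratio on raw performance
# alone, before counting the decoders, RT, and driver support it also lacks:
print(f"Break-even RX 580 price: ${rx6600_price * rx580_perf / rx6600_perf:.0f}")
```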

As for 12 nm, well, yeah, it technically got it, but in name only. It turns out that what was labelled 12 nm wasn't really better than the 14 nm-labelled node. This is how that happens:

Here, educate yourself. Basically, those node names are all a scam until we get some real numbers, which of course companies won't disclose until years after they're relevant.

3. True, not horrible, but also not excellent, OP wanted something that's on the leaner side, probably because they have no intention of upgrading in the future again without a full computer replacement.
RX 6600 it is then.

4. I was citing the advantages of the i3-10100 over the i7-6700, all of which are easily verifiable because, well, it is what it is. However, OP is not replacing their 6700 at all. A 6700 will not perform like a 10100, even if its cache is bigger, for the aforementioned reasons. Unless you literally disable all of the microcode-level bugfixes, disable Spectre and Meltdown protection in Windows, and live with the erratum...
I'm 100% sure it will perform as well as the i3-10100, because cache size matters and you think it doesn't. And those security patches weren't as bad as you think, and I bet Comet Lake still doesn't have them neatly fixed.
 
His i7-6700's performance is irrelevant; the limiting factor is the $300 budget. There is no sense arguing about whether the i7 is capable. The best card that fits that budget constraint is an RTX 2060 Super, an RX 6600, or an RTX 3050.
 
I'm 100% sure it will perform as well as i3 10100, because cache size matters and you are thinking it doesn't. And those security patches weren't as bad as you think and I bet that Comet Lake still doesn't have them neatly fixed.

You're taking conjecture at face value; that video has no relevance to the argument at hand, and it's also wrong. Comet Lake does have them fixed; that was somewhat the point of the second-wave, fixed-stepping Coffee Lake and all of the Comet Lake chips. But it's just a dragged-on argument now.

You won't concede, and I won't either... the 6700 is now almost a full seven generations old, and it's just not a fast processor anymore, no matter how much you want it to be. OP's main rig's 5600G (as stated in their system specs) is going to eat it for lunch. So this is no longer productive. OP also already got the advice they needed and thanked us for it, so let us go our separate ways.
 
never said that.
i just showed him that he even could buy a 3090 Ti for his 6700.
at 1440p it would be absolutely no problem in 9/10 games.

If he played any modern competitive game, it would make no sense to buy that card. But even for the vast majority of games outside of CP77 on ultra, it makes no sense and would be a lousy experience for the money spent.
 
You won't concede, and I won't either... the 6700 is now almost a full seven generations old and it's totally not a fast processor anymore, no matter how much you want it to be. OP's main rig's 5600G (as stated in their system specs) is going to eat it for lunch. So it's no longer productive. OP also already got the advice they needed and thanked us for it, so let us go our separate ways.
So what if it's old and not as fast as Lisa's latest jizz? It's still fast for gaming, more than good enough, and it won't bottleneck fast cards. I already showed you evidence of that, and you still keep foaming about it being old and shit.
 
For $300, I would say a 2060 Super.

I am running a 3060 and an 8600K. I think the 8600K is the limiting point in my system at present; for most games I run 1440p and medium settings to try for 100+ FPS. I only mention this because the 8600K and the 6700K are similar performers.

Depends on the exact game; my RTX 3060 won't even hit 100 FPS at 1440p in some games, and I have two more real cores than you.
 
My 2 year old RTX 2080 SUPER works very well with my aged 2700K if that helps?
Really? I don't think it does. I had a 3960X, which is far better than a 2700K, and it throttled a 1080 Ti hard, and the 1080 Ti is itself about 10% slower than a 2080 Super. This happened in Battlefield 5; as soon as you start a game that needs both CPU and GPU performance, it will choke hard. A few years ago a 2700K would've been mostly fine, but not today; many games will choke hard on this ancient CPU.
People underestimate 1440p a lot.
You could throw a 3090 Ti in that PC and still have almost all games max out the GPU.
Here is a 4790K (overclocked) with a 3090 at 1080p. (1440p would be at least 33% more load.)
A 4790K isn't a good fit for a 3090. A 3090 needs the best CPUs to not be throttled at 1080p/1440p; anything less than a 3700X will choke it at 1440p, and at 1080p it's throttled even with an Alder Lake chip.
what?! you do not need a 3090ti for 1440p that's insane, what are you doing playing CP77 on ultra? you don't even need a 3080.

"More power more better" but that's insane.
It's not insane, it's just better. CP2077 is one game, but there are more hungry games out there, and more will come soon as well.
From experience as someone who's owned a 3090 since launch day, Ampere scales very poorly to lower resolutions, and depending on the game that includes 1440p. At 1080p you're looking at barely the same performance an RX 6800 is going to achieve, sometimes less. Frame rates will not be anywhere near as nice as they should be. This isn't a CPU bottleneck; it's an internal architecture issue with super-wide GPUs like this. You've seen similar scalability issues before in the Radeon Fiji and Vega architectures.

The RDNA 2 architecture works differently: it performs better at 1080p and runs out of gas as you crank the resolution up. On Ampere the impact isn't anywhere near as bad; actually, you'll find the odd situation where increasing the resolution *helps* the overall experience with an RTX 3090. I feel like these "grab an old CPU from year X and tack a 3090 onto it" videos are rather meaningless because of that.
It is a CPU bottleneck. NVIDIA still uses a software scheduler, which means it cannot tap into CPU power as efficiently as AMD in DX12 or Vulkan games. That's why a 3090 gets easily overtaken even by old AMD GPUs if you pair it with something like a 1600X. I can link you the Hardware Unboxed video where they proved this, if you want.
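
A toy frame-time model of that argument (every number here is made up, purely to show the shape of the effect):

```python
def fps(game_cpu_ms: float, driver_ms: float, gpu_ms: float) -> float:
    """Frame rate limited by the slower of the CPU side (game logic plus
    driver/scheduling work) and the GPU side."""
    frame_ms = max(game_cpu_ms + driver_ms, gpu_ms)
    return 1000.0 / frame_ms

# Hypothetical: an old CPU needs 9 ms/frame for game logic, a 3090 needs
# 5 ms/frame at 1080p. Extra driver-side scheduling work lands on the CPU
# and lowers the CPU-bound ceiling.
for driver_ms in (1.0, 3.0):
    print(f"{driver_ms} ms of driver work -> {fps(9.0, driver_ms, 5.0):.0f} fps")
```

More driver work per frame on the CPU means a lower ceiling exactly when the CPU is already the slow side, which is the Hardware Unboxed finding in a nutshell.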
That i3 performs basically the same as 3900X
No, it doesn't. The 3900X is comparable to an 8700K, in some scenarios to a 9900K; the i3 will choke hard in games like BF5 that need a lot of CPU power. It can't feed the game and the GPU at the same time with those lousy 4 cores.
 
I mean, that video proves my point precisely: the X5675 comes in last in every benchmark he's run, and percentage-wise the losses approach 40%... I'll agree that the frame rates you'll get are "fine" (disregarding the rest), but you're wasting the GPU. At the end of the video he straight up says that using the 3090 with that processor knocks it down to the RTX 2080 Super's level, and he also touches on the same hypothetical I mentioned earlier: some games won't run at all because AVX isn't supported.

OP simply wants a practical upgrade that will be as light on the wallet as possible and not leave much on the table, so I think I'll stand by my recommendation of the 3050. :)

Good idea but wrong card; the 3050 is still a bad value everywhere I see it. Better to pay up for a 6600/3060/6600 XT, or down to a 2060.
 
No it doesnt. The 3900X is comparable to a 8700K in some scenarios to a 9900K, the i3 will choke hard in games like BF5 that need a lot of cpu power. It can’t feed the game and gpu at the same time with those lousy 4 cores.
I mean, if you call a 100+ fps experience choking, I have no words:
 
I mean, if you call 100+ fps experience as choking, I have no words:
I meant combined with a very strong GPU, like a 2080 Ti and upwards. Of course it can feed a 2060.
 
I meant combined with a very strong GPU like 2080 Ti and upwards. Ofc it can feed a 2060.
Just use 4K with settings cranked, then. Of course it will run literally the same.
 
Just use 4k with settings cranked then. Of course it will run literally the same.
Even then, the 10100 isn't good enough to feed high-end GPUs. You're overestimating 4K; it's not a magical "CPU bottleneck lifter". You'll most likely still lose 10-20% performance. As always, PCs should have a balanced setup between CPU and GPU performance.
 
It is a cpu bottleneck. Nvidia still uses software scheduler, this means it can not tap into cpu power as well as AMD, in dx12 or Vulkan games. That’s why 3090 gets easily overtaken even by old AMD GPUs if you pair it with something like a 1600X. I can link you the video of hardwareunboxed where they proved this, if you want.

Oh, yes, I am quite aware of NVIDIA's software scheduling; in fact, while it holds DX12 back a bit in some scenarios, it is also what makes NVIDIA indomitable in DirectX 11.

I have an interesting read for you too, on this subject. Written by an Intel developer, no less.


This is related to the draw calls and immediate vs. deferred GPU contexts :)

However, the behavior I refer to isn't CPU-related at all; it shows up regardless of API. It is very much a hardware thing: Ampere's frontend just isn't wide enough to keep the rest of the GPU fed at low resolutions.

Good idea but wrong card, 3050 is still a bad value everywhere I see it. Better to pay up for a 6600/3060/6600xt or down to a 2060.

Perhaps. I would be content dropping ~220 Bidens on one. The NVENC/NVDEC engine is worth it, and it's pretty lean on power.

Just use 4k with settings cranked then. Of course it will run literally the same.

I'm not entirely sure where the idea that 4K is a fix for slow CPUs came from, but it's a lot more complex than that... besides, a system with an i3 processor will rarely have an adequate GPU for 4K.
 
Even then 10100 isn’t good enough to feed the high end GPUs, you’re overestimating 4K, it’s not a magical “CPU bottleneck lifter”. You’ll most likely lose 10-20% performance. As always PCs should have a balanced setup between CPU and GPU performance.
The CPU reaches over 100 fps most of the time even when it is the limiting factor, so upgrading the GPU basically means higher settings and resolution with no performance impact, because the GPU isn't the limiting factor.
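
A minimal sketch of that bottleneck logic, with hypothetical numbers:

```python
def delivered_fps(cpu_cap: float, gpu_fps: float) -> float:
    """Delivered frame rate is roughly the lower of the CPU-side ceiling and
    what the GPU can render at the chosen resolution/settings."""
    return min(cpu_cap, gpu_fps)

cpu_cap = 110.0  # hypothetical CPU-side ceiling, e.g. a 10100 in some title

# Hypothetical GPU throughput for a fast card as the load increases:
for setting, gpu_fps in [("1080p high", 260.0), ("1440p ultra", 160.0), ("4K ultra", 95.0)]:
    print(f"{setting:11s}: {delivered_fps(cpu_cap, gpu_fps):.0f} fps delivered")
# Cranking settings/resolution costs nothing until the GPU drops below the
# CPU ceiling; at that point (4K ultra here) the GPU becomes the limit.
```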

I'm not entirely sure where the idea that using 4K is a fix for slow CPUs came from, but it's a lot more complex than just that... besides a system that has an i3 processor will rarely have an adequate GPU for 4K.
So how many fps do you get, 1000 or 2000? It's like 100 fps is "low" to you nowadays or something.
 
Oh, yes, I am quite aware of the NVIDIA's software scheduling, in fact, while it holds back DX12 a bit in some scenarios, it is also what makes NVIDIA indomitable in DirectX 11
Yeah, well, this isn't 2015 anymore; DX11 is quite irrelevant now. And AMD, despite the low relevance of DX11, has tweaked their drivers for DX11 now as well. It's simply better to buy AMD if you plan to use an old CPU.
CPU reaches over 100 fps most of the time, as long as it is a limiting factor, upgrading GPU basically means higher settings and resolution with no performance impact due it not being a limiting factor.


So how much fps you get, 1000 or 2000? It's like 100 fps is "low" nowadays to you or something.
Nobody will pair a 10100 with a 3090 because, again, it's a terrible pairing and the 3090 will even lose performance at 4K. You don't buy a 3090 to lose 10-20% of its performance with an old or weak CPU; nobody does that. And it's not about "100 fps", it's about using the system to its fullest ability.
 