Tuesday, August 16th 2022

Six-Year-Old GTX 1060 Beats Intel Arc A380, GeForce GTX 1630 and Radeon RX 6400, Wins TPU popularity contest

NVIDIA's GeForce GTX 1060 6 GB "Pascal" continues to be a popular entry-mainstream graphics card choice among TechPowerUp readers, over rivals that are two generations newer. The recent TechPowerUp Frontpage Poll asked our readers which graphics card they'd choose, assuming all were priced the same, with choices that included the GTX 1060 6 GB, GTX 1630 4 GB, GTX 1650 4 GB, RX 570 4 GB, RX 5500 XT 4 GB, RX 6400 4 GB, and the Arc A380 6 GB. The poll received a great response, with over 18,200 votes cast since it went live on June 30, 2022; it closed on August 16.

The GeForce GTX 1060 6 GB dominated the poll and nearly scored a simple majority, with 49 percent of the respondents, or 8,920 people, saying they'd choose the card over the others. A distant second was the RX 5500 XT 4 GB, with 15.1 percent, or 2,749 votes. The GTX 1650 and Arc A380 are nearly on par, each with about 11.9 percent, or around 2,170 votes. The remaining options, the RX 6400, RX 570, and GTX 1630, are marginal, single-digit-percentage choices.
The GTX 1060 6 GB is now over six years old, having launched in July 2016. It's based on the 16 nm "Pascal" graphics architecture, which has since been succeeded by two generations: the 12 nm "Turing" and the 8 nm "Ampere." With DirectX 12 feature-level 12_1 support, the card handles nearly all of the current online FPS, MOBA, and MMORPG titles with reasonably good settings at Full HD (1080p), which strikes at the core of the PC gaming market, or the very top of the bell curve. The GTX 1060 has been retired from NVIDIA's product stack, although the latest GeForce Game Ready drivers continue to support it. You may still find the card on the second-hand market, such as eBay, where it can be had for well under $200.
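For readers curious what that feature-level claim means in practice, Direct3D 12 lets you ask the runtime directly. The following is a minimal sketch (our illustration, not part of the report; assumes a Windows SDK/MSVC toolchain, and the file name and build line are ours) that checks whether the default adapter exposes feature level 12_1 without actually creating a device:

// fl_check.cpp - minimal probe for Direct3D 12 feature level 12_1.
// Passing nullptr as the output pointer makes D3D12CreateDevice a pure
// capability query: S_FALSE means "supported" and no device is created.
// Build (MSVC): cl /EHsc fl_check.cpp d3d12.lib
#include <d3d12.h>
#include <cstdio>

int main() {
    HRESULT hr = D3D12CreateDevice(
        nullptr,                   // default adapter
        D3D_FEATURE_LEVEL_12_1,    // minimum feature level to test for
        __uuidof(ID3D12Device),    // interface we would request
        nullptr);                  // null => capability check only
    std::printf("Feature level 12_1: %s\n",
                SUCCEEDED(hr) ? "supported" : "not supported");
    return SUCCEEDED(hr) ? 0 : 1;
}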

What's more interesting is that the GTX 1060 beats every AMD rival hollow, including the RX 5500 XT that's based on the 7 nm RDNA architecture, and the newer RX 6400, based on the 6 nm RDNA2. Although barely available in the West, the Intel Arc A380 appears to be riding on some novelty value, with people eager to check out the capabilities of Intel's latest 6 nm Xe-HPG "Alchemist" graphics architecture.

Catch the TechPowerUp Reviews of each card from this poll:

GeForce GTX 1060 6 GB | GeForce GTX 1630 4 GB | GeForce GTX 1650 4 GB | Radeon RX 570 4 GB | Radeon RX 5500 XT 4 GB | Radeon RX 6400 4 GB | Arc A380 6 GB

41 Comments on Six-Year-Old GTX 1060 Beats Intel Arc A380, GeForce GTX 1630 and Radeon RX 6400, Wins TPU popularity contest

#26
Chrispy_
Dr. Dro: I disagree. Pascal owners love their cards for good reason - but I keep seeing people swearing by these ancient graphics cards as if they were special. You simply need to experience Ampere if you think that's something to write home about. You'll find the RTX 3050 a more than adequate match for a GTX 1070, maybe even a 1070 Ti; and that's in the games that Pascal can run decently (DX11).
I just sold a GTX 1080 for £200. It's better than anything else you can buy for £200 right now, so as far as I'm concerned the buyer got a good deal. The 3050 costs almost 50% more than that, and the 3060 almost double.

If it were me buying a new card right now, I'd be grabbing a new RX 6600 8 GB; you can find them in the UK for £250, and they're far and away the best performance/£ of anything new right now. They're not 25% faster than a 1080 on average though, so a £200 GTX 1080 is still a reasonable buy if you either don't have a £250 budget, or if you're comparing it to any NVIDIA card (perhaps you want the drivers, the encoder, CUDA support - whatever...).
#27
Dr. Dro
Chrispy_: I just sold a GTX 1080 for £200. It's better than anything else you can buy for £200 right now, so as far as I'm concerned the buyer got a good deal. The 3050 costs almost 50% more than that, and the 3060 almost double.

If it were me buying a new card right now, I'd be grabbing a new RX 6600 8 GB; you can find them in the UK for £250, and they're far and away the best performance/£ of anything new right now.
That may be so, but at the same time, the buyer probably isn't going after state-of-the-art technology, which is somewhat the point of my argument in favor of the Arc GPU. It's the one with the freshest technology in it, even if the drivers are not yet ready. These things aren't developed overnight, and I think it's quite unfair to measure this infant product stack against the decades of driver expertise you will find in both NV and AMD GPUs right now.

Other than that, both Ampere and RDNA 2 products are on the same list, and at this kind of price point there are things that I personally value more than raw power. But you do you, fellas; that's the beauty of the PC. At the end of the day, what makes it special is that you can tailor it to your own needs.
#28
Chrispy_
Dr. Dro: That may be so, but at the same time, the buyer probably isn't going after state-of-the-art technology, which is somewhat the point of my argument in favor of the Arc GPU. It's the one with the freshest technology in it, even if the drivers are not yet ready. These things aren't developed overnight, and I think it's quite unfair to measure this infant product stack against the decades of driver expertise you will find in both NV and AMD GPUs right now.

Other than that, both Ampere and RDNA 2 products are on the same list, and at this kind of price point there are things that I personally value more than raw power. But you do you, fellas; that's the beauty of the PC. At the end of the day, what makes it special is that you can tailor it to your own needs.
I agree with you about the freshest tech and feature set, but I don't think Intel's first-gen GPUs are the ones to buy for that. By the time their drivers are in a decent state, and by the time DX9 and DX11 performance is less relevant, this first gen will already be retired/obsolete.

We know there are uncorrectable errors in the silicon itself (Intel admitted as much), and I believe that their second-generation attempt will be the one to seriously consider. They're still new at dGPUs, and this first gen is full of mistakes and things that qualify as "trial and error" lessons.

We just have to hope that Intel persists with this loss-leader and makes it to a second generation. The fear is that it won't turn an immediate profit, and so the stupid board of directors will just bow to shortsighted shareholder pressure to can the whole GPU lineup. Intel's expertise and in-house fabs could mean that Intel becomes the #1 player in time, but it could easily take a decade for that to happen, if it happens at all. They have the money and they have the supply chain; all they need is commitment and persistence.
#29
Dr. Dro
Chrispy_: I agree with you about the freshest tech and feature set, but I don't think Intel's first-gen GPUs are the ones to buy for that. By the time their drivers are in a decent state, and by the time DX9 and DX11 performance is less relevant, this first gen will already be retired/obsolete.

We know there are uncorrectable errors in the silicon itself (Intel admitted as much), and I believe that their second-generation attempt will be the one to seriously consider. They're still new at dGPUs, and this first gen is full of mistakes and things that qualify as "trial and error" lessons.
Agreed, it is a bumpy start, but it is a start nonetheless. One must crawl before one can walk. I am sure many people thought the same when the GeForce 256 and the original Radeon launched back in 1999-2000, or when unified-shader cards arrived with the G80 in 2006. New tech often comes with big changes, and change can understandably be scary.

RDNA 2 is by all means a polished and well-maintained architecture, so I'd nudge people who want a more carefree experience towards it.

Those who seek raw performance will weigh their choices and take whatever path they deem best (such as buying older, less efficient but powerful hardware, even if it's behind in features or approaching end of life), and then there are those like me. I'm excited by new technology, that matters more to me, and that's why I like to keep things fresh :)

I can't wait for RDNA 3, for example. I'm gonna have a field day with it.
#30
ThrashZone
Hi,
GPUs are only just now seeing price reductions, so repeating the NVIDIA PR line "just buy it" for RTX cards is insane.

I won't be buying a new GPU until I have a real need for one, so the 980 Ti / 1080 Ti / Titan Xp will still do their thing until they can't anymore, and I sure wouldn't waste any time or money on a 3050 or 3060, lol, that's just crazy talk; unless of course they came in a laptop, but for a desktop that's just funny :laugh:
#31
Dr. Dro
ThrashZone: Hi,
GPUs are only just now seeing price reductions, so repeating the NVIDIA PR line "just buy it" for RTX cards is insane.

I won't be buying a new GPU until I have a real need for one, so the 980 Ti / 1080 Ti / Titan Xp will still do their thing until they can't anymore, and I sure wouldn't waste any time or money on a 3050 or 3060, lol, that's just crazy talk; unless of course they came in a laptop, but for a desktop that's just funny :laugh:
Correct me if I'm wrong, but don't you use Windows 7? If your operating system doesn't support any of these newer graphics APIs, there is really nothing for you to see. It stands to reason that the 10 series would be fully capable of offering everything that the WDDM 1.1 model was designed to do.

My argument is by no means the same as Avram Piltch's infamous "just buy it" pitch, since we're talking about budget hardware. I fully realize that I'm playing devil's advocate.
#32
ThrashZone
Dr. Dro: Correct me if I'm wrong, but don't you use Windows 7? If your operating system doesn't support any of these newer graphics APIs, there is really nothing for you to see. It stands to reason that the 10 series would be fully capable of offering everything that the WDDM 1.1 model was designed to do.

My argument is by no means the same as Avram Piltch's infamous "just buy it" pitch, since we're talking about budget hardware. I fully realize that I'm playing devil's advocate.
Hi,
There are Win 7 drivers for the 20 and 30 series, so I'm not sure what your point is.
My point is that your statement is just wrong; think maybe the devil made you do it, stands to reason :laugh:

I use 7 - 10 and 11.
#33
Dr. Dro
ThrashZone: Hi,
There are Win 7 drivers for the 20 and 30 series, so I'm not sure what your point is.
My point is that your statement is just wrong; think maybe the devil made you do it, stands to reason :laugh:
The point is that even if drivers exist (disregarding the fact that they are out of date), it doesn't mean you can use every feature available, because your OS is not aware of what these features are or what they are for.

You being blissfully oblivious to what I speak of doesn't make me wrong, but I'll tell you this: just having multiplane overlay support makes the upgrade worth it in my eyes. MPOs are unfortunately not supported on Pascal, though AMD does support them on Vega.
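For anyone wanting to verify the multiplane-overlay point on their own machine, DXGI 1.3 exposes a per-output query. The following is a minimal sketch (our illustration, not part of the thread; assumes a Windows SDK/MSVC toolchain, and the file name and build line are ours) that enumerates adapters and outputs and calls IDXGIOutput2::SupportsOverlays(), which reports whether the adapter driving an output supports multiplane overlays:

// mpo_check.cpp - list every adapter/output pair and whether the
// adapter supports multiplane overlays (MPOs) on that output.
// Build (MSVC): cl /EHsc mpo_check.cpp dxgi.lib
#include <dxgi1_3.h>
#include <cstdio>

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT a = 0; factory->EnumAdapters1(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a) {
        IDXGIOutput* output = nullptr;
        for (UINT o = 0; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o) {
            IDXGIOutput2* output2 = nullptr;
            // SupportsOverlays() only exists on IDXGIOutput2 (DXGI 1.3+),
            // so query for that interface first.
            if (SUCCEEDED(output->QueryInterface(__uuidof(IDXGIOutput2),
                                                 (void**)&output2))) {
                std::printf("Adapter %u, output %u: MPO %s\n", a, o,
                            output2->SupportsOverlays() ? "supported"
                                                        : "not supported");
                output2->Release();
            }
            output->Release();
        }
        adapter->Release();
    }
    factory->Release();
    return 0;
}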
#34
Bomby569
1060 6 GB and 1080 Ti will never die, iconic beasts
#35
pavle
Dr. Dro: ...These things aren't developed overnight, and I think it's quite unfair to measure this infant product stack against the decades of driver expertise you will find in both NV and AMD GPUs right now...
I feel I have to reply to this - who has been the leading graphics chip producer (regardless of GPU size) for ages, and with the biggest market share? Intel.
If they aren't able to produce stable drivers now, it's not likely they will any time soon.
If we look at NVIDIA from the GeForce 256 up, they already had very stable drivers even with the Riva TNT2, and much more so with the GF1 (256). It's not that the GeForce 256 was anything new, just a continuation and upgrade of their architecture, and the same goes for all the higher chips (quite an achievement really; the key is probably discipline, which Intel lacks).
#36
Bomby569
pavle: I feel I have to reply to this - who has been the leading graphics chip producer (regardless of GPU size) for ages, and with the biggest market share? Intel.
If they aren't able to produce stable drivers now, it's not likely they will any time soon.
If we look at NVIDIA from the GeForce 256 up, they already had very stable drivers even with the Riva TNT2, and much more so with the GF1 (256). It's not that the GeForce 256 was anything new, just a continuation and upgrade of their architecture, and the same goes for all the higher chips (quite an achievement really; the key is probably discipline, which Intel lacks).
Developing iGPUs is not the same thing.

"Our software release on our discrete graphics was clearly underperforming," said Gelsinger. "We thought that we would be able to leverage the integrated graphics software stack, and it was wholly inadequate for the performance levels, gaming compatibility, etc. that we needed. So we are not hitting our four million unit goal in the discrete graphics space, even as we are now catching up and getting better software releases."

Riva was done at a different time, when everything was much simpler. And both NVIDIA and especially AMD, even with decades of experience with high-performance drivers, still make some spectacularly shitty drivers. You can easily find AMD releasing an updated driver for a single game that gives 30% more performance (and I remember cases of 40% and more in my time with AMD):

"you have an AMD GPU with a product name starting ‘Radeon RX 6…’ the driver should deliver the following performance improvements in these games:
  • World of Warcraft: Shadowlands – up to 30%
  • Assassin’s Creed Odyssey – up to 28%"
www.ccleaner.com/knowledge/amd-driver-update-improve-performance
#37
Dr. Dro
Bomby569: Developing iGPUs is not the same thing.

"Our software release on our discrete graphics was clearly underperforming," said Gelsinger. "We thought that we would be able to leverage the integrated graphics software stack, and it was wholly inadequate for the performance levels, gaming compatibility, etc. that we needed. So we are not hitting our four million unit goal in the discrete graphics space, even as we are now catching up and getting better software releases."

Riva was done at a different time, when everything was much simpler. And both NVIDIA and especially AMD, even with decades of experience with high-performance drivers, still make some spectacularly shitty drivers. You can easily find AMD releasing an updated driver for a single game that gives 30% more performance (and I remember cases of 40% and more in my time with AMD):

"[If] you have an AMD GPU with a product name starting ‘Radeon RX 6…’ the driver should deliver the following performance improvements in these games:
  • World of Warcraft: Shadowlands – up to 30%
  • Assassin’s Creed Odyssey – up to 28%"
www.ccleaner.com/knowledge/amd-driver-update-improve-performance
Yup, and the biggest mistake with Arc is that whoever was in charge of the software development (folks blaming Raja again, he's like the boogeyman or something) thought they could leverage their existing integrated-graphics driver code base for the latest generation and work from there. Let's just say that wasn't such a great idea.
#38
Bomby569
Dr. Dro: Yup, and the biggest mistake with Arc is that whoever was in charge of the software development (folks blaming Raja again, he's like the boogeyman or something) thought they could leverage their existing integrated-graphics driver code base for the latest generation and work from there. Let's just say that wasn't such a great idea.
"Thought they could" is a weird thing. What were they doing? I'm sure they've had prototypes for a long time now; I can't understand how any of this came as a surprise to Intel. Back in February they were promising us the moon.

The argument is that the iGPUs never received optimized drivers, so why on earth didn't they try that first before jumping head-first into making new GPUs with drivers they never got to optimize on the existing iGPUs? There's an insane amount of bad workmanship and leadership at Intel.
#39
Dr. Dro
Bomby569: "Thought they could" is a weird thing. What were they doing? I'm sure they've had prototypes for a long time now; I can't understand how any of this came as a surprise to Intel. Back in February they were promising us the moon.

The argument is that the iGPUs never received optimized drivers, so why on earth didn't they try that first before jumping head-first into making new GPUs with drivers they never got to optimize on the existing iGPUs? There's an insane amount of bad workmanship and leadership at Intel.
I can totally imagine that this preposterous idea was imposed by the suits that Gelsinger delegated to take care of the graphics division as he restructured the company.

I'm also fairly sure, and inclined to believe, that the engineers warned them, but executives haven't the faintest clue; they probably saw that their integrated graphics ran CS:GO and Dota and refused to allow development of an all-new stack, which is a multi-million-dollar investment. Except that this time they couldn't just call in the Linux whiz kids they hire to maintain their open-source iGPU drivers, so it all fell squarely on the hardware itself.

At least Gelsinger is personally owning up to that mistake, and I strongly feel he's done very well by Intel in his tenure as CEO thus far.
#40
Bomby569
Maybe it's a culture thing inside Intel, but in my company we are encouraged to point out things like this, which can be the difference between a lot of money won or lost, and we'd probably get a reward.
This company has a massive problem inside.
#41
Dr. Dro
Bomby569: Maybe it's a culture thing inside Intel, but in my company we are encouraged to point out things like this, which can be the difference between a lot of money won or lost, and we'd probably get a reward.
This company has a massive problem inside.
Most mega-corps have people in charge who shouldn't be, you know, the usual. But hopefully it's a learning experience.