
Upgrade from a GTX 1060 video card for an X570 AM4 MB w/ a Ryzen 9 3900X

I see the GTX 1060 series is still alive and well, but I want to step up to something that would not be a bottleneck for that processor.
I will be running SSDs on W10, nothing overclocked. I realize this is a tough question since there are so many models, many very similar to others.

I was originally looking at an AMD card, but I understand Nvidia has an advantage here. I do NOT "game", but I do video editing, mostly towards the lower end (just above basic). So just how far up the ladder do I have to climb here? I'm not looking for cheap, but surely not anywhere near the higher end, just a noticeable improvement over the AMD FX-8350 AM3+ system I had.

I've been looking at a number of benchmark sites, which is mind-boggling enough as it is. I hope this made some sense; if not, ask away.
 
Bottleneck for what? What video editing software are you using, and is it zero gaming?
 
The weakest link in the chain.
I was just going to add that the video software I use (Movavi Video Editor Plus v15, an older version) does benefit from hardware acceleration. I tested it with and without, and with it enabled it cuts 50% off the processing time.
 
How many forum threads do you need?


 
The more the better; it's called additional opinions. Do you just take the first few responses you get to a question?
 
Puget Systems has a DaVinci Resolve benchmark, and for pairing with the 3900X I'd say something around RTX 3080 level.
 
In that series, how about the 3050? It seems all the models above the 3050 really get up there in price. I was looking to stay around $200. That 3080 is around 4x that price. :(
 
Knowing where you are located is very helpful for picking parts; otherwise, it is very hard to assess what is available to you and in what currency. Where you intend to buy parts from is also useful (Amazon, Newegg, etc.).
 
In that series, how about the 3050? It seems all the models above the 3050 really get up there in price. I was looking to stay around $200. That 3080 is around 4x that price. :(
I sold my 3080 for $400, which is slightly cheaper than eBay prices last I saw. I wouldn't get a 3050 given how much they cost and how little performance you get.

Maybe see how a Titan Xp performs compared to a 3060(ti)/3070. They're fairly cheap and solid.
 
Knowing where you are located is very helpful for picking parts; otherwise, it is very hard to assess what is available to you and in what currency. Where you intend to buy parts from is also useful (Amazon, Newegg, etc.).
NY State.
As far as source, the two main sources you mentioned, or any other reputable dealer/seller.

What does the "Ti" stand for?
 
NY State.
As far as source, the two main sources you mentioned, or any other reputable dealer/seller.

What does the "Ti" stand for?
Titanium, or it used to. Think of it as a half step up over the normal one.

A 3060 Ti is kind of like a 3065. It's just a marketing thing NVIDIA has done for years, and it used to be reserved for the top of the top (780 Ti, 980 Ti, 1080 Ti, 2080 Ti, 3090 Ti). There was also the small bump for mid-range, like the 660 Ti, 760 Ti, and 3060 Ti. They added Super to the naming in the 20/16 series, which was dumb as hell.

Then NVIDIA, being a green head, stopped following their own naming. We got the 4070 Ti Super. Shit is dumb as hell.
 
$200 is pretty tight for a GPU right this moment, and pickings are slim.

An 8 GB 3050 would be a substantial upgrade from the 1060. I wouldn't consider the 6 GB 3050, though.

I'm seeing new and open-box RX 6650 XT and RX 7600 cards in the same price range as a new 8 GB 3050 ($200-$300).

One of them would probably be my preference, but I am thoroughly red team.

There are lots of used and refurb cards available in the $200-$300 price range. There are some used MSI 6700 XTs on eBay for $195. That would outrun most anything else I'm seeing at this price. In my opinion, it would be a good match for a 3900X and X570.

I wouldn't get a used one, but that is just me; I've certainly worked with plenty of used and refurb parts.
 
I see the GTX 1060 series is still alive and well, but I want to step up to something that would not be a bottleneck for that processor.
I will be running SSDs on W10, nothing overclocked. I realize this is a tough question since there are so many models, many very similar to others.

I was originally looking at an AMD card, but I understand Nvidia has an advantage here. I do NOT "game", but I do video editing, mostly towards the lower end (just above basic). So just how far up the ladder do I have to climb here? I'm not looking for cheap, but surely not anywhere near the higher end, just a noticeable improvement over the AMD FX-8350 AM3+ system I had.

I've been looking at a number of benchmark sites, which is mind-boggling enough as it is. I hope this made some sense; if not, ask away.
For modern video editing, an RTX 3060 (8 GB or 12 GB) would be an excellent option. You don't really need anything more powerful. If you want to stay with something new, a 4060 Ti would also be excellent.

One of them would probably be my preference, but I am thoroughly red team.
The OP stated they are going with NVidia.
 
If you wanted to draw a parallel to your existing generation, the RTX 3050 (presuming it's the 8 GB version and not the slower 6 GB version) would be about as fast as a GTX 1070. Considering it's coming up on a decade since the initial Pascal generation released, that is way too small an increase to make right now, in my mind.

Unfortunately, you're not going to get much for $200 new right now, so the budget might restrict you. I'm not sure if any RX 6600 non-XTs are still on the new market for that cheap. Maybe an Intel Arc is your best bet at that price point, since your system supports Resizable BAR on that CPU (which Arc needs, or else it suffers a performance drop). Otherwise, the new graphics card market is terrible these days. Your Pascal generation was the last time it was truly great. Now, mid-range cards are like $500 to $800 give or take (arguably higher, as even the RTX 5080 is "mid-range" by traditional norms if you look at it relative to the RTX 5090, but this is masked because no true higher-end cards exist between it and the flagship, so the "upper mid-range" or "lower high-end" has simply vanished). The $250 to $450 point is now entry-level GPUs that are called x600s and x60 Tis to appear a bit better than they are.

The used market might be better. That might get you an RTX 3060-ish card (?), which is around GTX 1080 or 1080 Ti performance and closer to twice your existing GTX 1060 6 GB. That's the absolute lowest I would go for in an upgrade.

Also, gamers aren't the real reason the market is the way it is. The best-selling SKU is typically Nvidia's x50 or x60, sometimes the x70 if those lower two SKUs are bad in a generation (GTX 900 series, RTX 40 series). The enthusiasts buying x80s every other generation no matter the cost are the minority... and even they are a drop in the bucket compared to a new, more profitable market. It would take way too long to cover all the nuances of why the market is the way it is, but there are two big standouts right now in my mind...

One is AI (and even that's not the whole story). AI chips are simply more profitable than consumer chips, and since silicon production time is finite, cutting the consumer chips down and selling them for more on the gaming side, while also using a bigger share of production for AI, is a win-win. The lesser amount of production devoted to consumer chips leads to less supply, and... we know what happens to prices when supply can't meet demand. The alternative is to use lesser nodes for consumer GPUs, but that leads into the second point.

The second is that process node shrinks have slowed since ~2021 or so, and advancement has slowed with them. When that happens, you need to throw more power/cores at things for a substantial performance gain, and even then the gains are smaller than they otherwise would be.

Of course, companies wanting profits and customers bearing the cost is ultimately part of it.
 
How and where does memory on the card come into play here? Isn't that what motherboard RAM is for?

I've been gathering GPU testing charts from a number of sites. Most give a 'number' with no reference point. One showed FPS, which seems more meaningful (to me):
Passmark Software, 'Top CPU' (which I hadn't heard of but found meaningful), and 'GPU Check' (using FPS at three different resolutions).

The OP stated they are going with NVidia.
But that was based on what I have read previously. The majority of posters here and elsewhere seem to feel Nvidia is superior.

Of course, companies wanting profits and customers bearing the cost is ultimately part of it.
More like all of it. :respect:
I can raise my range to include $250, but past that I can't justify it.
 
How and where does memory on the card come into play here? Isn't that what motherboard RAM is for?

I've been gathering GPU testing charts from a number of sites. Most give a 'number' with no reference point. One showed FPS, which seems more meaningful (to me):
Passmark Software, 'Top CPU' (which I hadn't heard of but found meaningful), and 'GPU Check' (using FPS at three different resolutions).
For what you're doing, 8 GB would be fine, unless you intend to start editing at 4K+ resolutions; then a higher-end card with 12 GB or 16 GB might be in order. A 4070 12 GB or 4080 16 GB would meet your needs better.
But that was based on what I have read previously. The majority of posters here and elsewhere seem to feel Nvidia is superior.
And I would agree with that. For video editing, Nvidia has the clear advantage.

I can raise my range to include $250, but past that I can't justify it.
Oh, well then yeah, a higher-end card is not an option for you. Sorry about that; you posted this while I was typing a response. :laugh:
Given this info, maybe a used 2070 or 2080 would fit your budget, depending on where you live. Both of those would still be a step up in performance and would utilize NVENC very well.
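
If you ever want to sanity-check how much NVENC actually buys you outside of Movavi, a rough way is to export the same clip twice with ffmpeg (just an illustration with a tool not mentioned in this thread; assumes an ffmpeg build compiled with NVENC support and made-up file names) and compare the run times:

# Software (CPU) H.264 encode - hypothetical file names
ffmpeg -i clip.mp4 -c:v libx264 -crf 20 -c:a copy out_cpu.mp4
# Hardware (NVENC) H.264 encode on the RTX card
ffmpeg -i clip.mp4 -c:v h264_nvenc -cq 20 -c:a copy out_gpu.mp4

On any of the RTX cards mentioned here, the NVENC run should finish noticeably faster, and it's presumably the same kind of acceleration Movavi taps when hardware acceleration is enabled.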
 
In new cards, $250 will get you an RTX 3050, an RX 6600, or an Intel Arc A750. Take your pick.
 
RX 6600, I'd suggest.
 
Any xx60 series (2060, 3060, 4060) or higher card from Nvidia, used (ideally) or new.

Nvidia usually has better driver support for creative workloads.

Search your local stores and list the options with their respective prices.
 
A used RTX A2000, if you can find one below $250. Slot-powered, full professional driver suite, error-correcting memory support, advanced color support. Unrestricted NVENC with unlimited concurrent encoding sessions. Excellent option for video editing, and it can run basic games on the side. Got mine for around $200 three months ago. You might have to look around, but you can probably grab one.

If sticking strictly to new, the RTX 3050 8 GB is probably what you should get. It's a gaming card, but it'll do the trick for what you want to do. Avoid AMD GPUs - their encoder is junk unless you buy the RX 9070 XT.
 
NY State.
As far as source, the two main sources you mentioned, or any other reputable dealer/seller.
At the moment on eBay there's one for $233.50 that ships from NJ. I see quite a few ongoing bids on 4060s.

I see Intel Arc A750 cards going for around $200 on eBay. Couldn't you just edit in one format using your older software and then use something like HandBrake to encode to the target format? I admit I don't do video editing, so I have no idea. My thought is that this might give you more video card options for the transcoding, if the editing portion alone is OK with your current card.
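
For what it's worth, HandBrake also has a command-line version, so the edit-then-transcode idea above could be scripted. A minimal sketch, assuming a HandBrake build with NVENC enabled and made-up file names:

# Re-encode the edited export to H.264 using the card's NVENC encoder (hypothetical file names)
HandBrakeCLI -i edited_export.mov -o final.mp4 -e nvenc_h264 -q 24 --all-audio

The exact encoder names (nvenc_h264, nvenc_h265) depend on the HandBrake version, and a quality setting around -q 22-24 is a common starting point.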

(edit, some more thoughts...)
In that scenario, I'm not sure if you would still need your existing card for editing with your existing software (for compatibility) and the newer card simply to handle the transcoding. Your Gigabyte X570 Aorus (not sure which one you have) hopefully has two x16 slots; otherwise I suppose a single-GPU system might be the only way to go, so you will have to choose wisely.
 