
My not so ultimate 2025H1 GPU purchase advice

So, the dust has settled and all the GPUs that were supposed to matter have been released. This guide is meant to help gamers more than anyone else. Professional users already know AMD doesn't do it right, or at least they're supposed to be aware of that. Let's go.

DISCLAIMER:
The author of this post is an active AMD GPU user who knows what he is talking about when criticising AMD products.
If you feel like I'm pro-nVidia or anti-AMD, you're wrong about the former and right about the latter.

ADVICE No. 0: DON'T BUY A GPU UNLESS YOU ABSOLUTELY NEED TO. BOTH AMD AND NVIDIA DESERVE A MASSIVE BOYCOTT.

General thoughts:


I'm starting from the very bottom: the RX 9060 series VS RTX 5060 series contest.
Both of these series are some of the biggest bollocks of all time. Almost negative improvement compared to the previous-gen clown fiesta aka the RTX 4060 Ti. The 16-gig version of the 5060 Ti might've been a decent option, but at its burglary of a price it's a straight-up no. Ignore the 8-gig models, they won't last. The 9060 XT 16 GB is technically a reasonable value product, but taken more seriously, it doesn't shift a gear. It loses to the 5060 Ti in game stability and upscaling support. The latter is devastating because at this performance level, and with modern game system requirements, upscaling is unfortunately a must. And FSR4 is far from being everywhere.
A used 4070 could be a healthy alternative to either of them. If you can't trust your sellers, or if the prices on it are out of whack... Well. Whatever xx60 GPU of this gen you pick, it will be an overpriced piece of rubbish. Don't come back saying you weren't warned.

Mid-range. RX nada VS RTX 5070 VS used 4070 Super/Ti.
If you care about 32-bit CUDA stuff, the answer is obvious. If you care about value, pick whatever is cheaper. RDNA3 GPUs are irrelevant at this point. The 9070 non-XT might crawl into this price range; if it does, and you're dead sure its half-baked state doesn't scare you, go for it. I wouldn't, because who knows how long it'll take AMD to fix game crashes (rare, to be fair), poor optimisation (less rare, but still rare) and missing native FSR4 support (the vast majority of games).

Upper mid-range. RX 9070 series VS 5070 Ti VS 4070 Ti Super VS 4080 series.
Same again: if you care about 32-bit CUDA, then it's obviously either the 4070 Ti Super or the 4080, depending on how much you're ready to invest. If you don't... the 9070 XT offers about as much raw performance as the 4070 Ti Super, sometimes better, sometimes worse. But it offers no CUDA, native FSR4 is almost non-existent, and who knows when AMD will fix this. Want to risk it? Go ahead, but let's just say I warned you. As for the 5070 Ti... not a lot is right with it. It's a seriously overpriced GPU with problems of its own, but if 32-bit CUDA support isn't a concern, it's the current best buy. I hate it, but what can I do.

Enthusiast range. Only nVidia cards populate this area. The 4090 if you care about 32-bit CUDA and/or having "only" 16 GB VRAM is inconvenient for you. The 5080 otherwise. The 5090... if you can afford it, why are you reading this?

VRAM concerns:


8 GB VRAM is a no-go in 2025, at least if you're paying more than two C-notes for a GPU. There is plenty of YouTube material showcasing how problematic this VRAM size is.
12 GB VRAM is passable. Nothing to write home about; I wouldn't call it enough for an exceptional gaming experience, but it's enough for reasonable gameplay. It will do if you don't do path tracing and aren't going to crank everything up to the max. 4K monitor users should probably avoid GPUs this short on VRAM.
16 GB VRAM is totally fine for now. There are maybe a couple of games where you can meaningfully use more, but those are complete outliers, and they mostly feel worse on AMD cards, which handle VRAM shortages a little worse than their nVidia counterparts. See Marvel's Spider-Man 2 at max settings, for example: the 16-gigabyte 9070 XT runs out of VRAM while the 16-gigabyte RTX 5070 Ti feels perfectly fine. There may be other cases like that.
20+ GB VRAM is overkill for actual gaming. If you get a GPU this rich in VRAM, you're most likely doomed to run out of calculating power before running out of VRAM.
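If you want to sanity-check your own situation before upgrading, here's a minimal sketch of reading VRAM usage while a game is running. It assumes an nVidia card with nvidia-smi on the PATH (those query flags are standard); AMD users can read the same numbers off the Adrenalin overlay instead.

```python
# Minimal VRAM headroom check (nVidia only; assumes nvidia-smi is on PATH).
# Run it while your heaviest game is loaded to see how close you sit to the limit.
import subprocess

out = subprocess.check_output(
    ["nvidia-smi",
     "--query-gpu=name,memory.total,memory.used",
     "--format=csv,noheader,nounits"],
    text=True,
)
for line in out.strip().splitlines():
    name, total, used = (field.strip() for field in line.split(","))
    total_mib, used_mib = int(total), int(used)  # nvidia-smi reports MiB
    print(f"{name}: {used_mib}/{total_mib} MiB ({used_mib / total_mib:.0%}) in use")
```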

RDNA2/3 and Ampere GPUs


These, frankly, are best avoided. RDNA3 cards might pack some punch in pure raster, but they don't support FSR4, they lack RT performance, they fall off in UE5 games, and things will only get worse for them. FSR3 upscaling is far from acceptable (at least to my eyes) and it will never evolve into something good. RDNA2 is just a huge nope, unless you paid like two hamburgers for a card. Ampere GPUs are much better value, but they have very limited VRAM (10 GB on the 3080, for example, doesn't feel right, and 8 GB on the 3060 Ti and the 3070 series is just THE insult). If you're completely sure of what you're doing, don't fear wattage spikes (which are horrible on both Ampere and RDNA2 cards), and don't mind the cards' age and mining history, then a 3080 series GPU might be a very good purchase. Check whether your PSU can handle it before pulling the trigger, though.

Older GPUs are generally only okay if bought for a crumb of lunch money and slotted into a system you don't really care about. The 2080 Ti, though, is a healthy alternative to the 5060 and 9060 cards, provided it's cheaper, the seller is reputable, and your PSU can handle it.


What about Intel?


Intel dGPUs are a good addition to the mix as a concept, but their actual performance is limited, and you're mostly buying a cat in a bag with such GPUs. Absolutely avoid them if you don't have a CPU with outstanding single-thread performance (an i5-13600K or Ryzen 5 7600, or better).

What about PCI-e lane count?

x8 cards will be fine if your system supports PCI-e 4.0 or higher. On 3.0 systems, expect significantly worse performance whenever VRAM is fully utilised; it's a major issue for the 8-gigabyte 5060 cards. Once again, if your system can run 4.0, you don't have to worry about it. x16 GPUs are generally fine on anything that supports 3.0.
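To put numbers on that, here's a quick back-of-the-envelope calculation of one-direction link bandwidth (8 GT/s per lane for 3.0, 16 GT/s for 4.0, 128b/130b encoding; protocol overhead ignored):

```python
# Effective one-way PCI-e bandwidth: raw rate x 128b/130b encoding x lane count.
GT_PER_S = {"3.0": 8, "4.0": 16, "5.0": 32}  # per lane

def link_gb_s(gen: str, lanes: int) -> float:
    return GT_PER_S[gen] * (128 / 130) / 8 * lanes  # GT/s -> GB/s

for gen in ("3.0", "4.0"):
    for lanes in (8, 16):
        print(f"PCI-e {gen} x{lanes}: ~{link_gb_s(gen, lanes):.1f} GB/s")
# PCI-e 3.0 x8  -> ~7.9 GB/s: the pipe an x8 card gets on an old board.
# PCI-e 4.0 x8  -> ~15.8 GB/s: same as 3.0 x16, which is why 4.0 "just works".
# The halved pipe only really hurts once VRAM overflows into system RAM.
```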

What about power efficiency?


The difference is minimal if we talk RDNA4 VS Ada VS Blackwell. Pick your poison.

What about 12VHPWR?


I hate it and I don't recommend running it at full throttle. Do yourself a favour and stick to the 120 W per 8-pin wattage formula. Running a 5090 at 575 W isn't safe; 480 W should be fine. Make sure nothing bends and everything stays put.
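If you want to actually follow that formula, the arithmetic looks like this. A minimal sketch, treating 12VHPWR as the four 8-pins it replaces; the stock TGP numbers below are the usual reference figures, and on nVidia you'd apply the result with nvidia-smi's power-limit switch (needs admin rights):

```python
# "120 W per 8-pin" applied to 12VHPWR, which stands in for four 8-pin plugs.
SAFE_W_PER_8PIN = 120
EIGHT_PIN_EQUIV = 4  # 12VHPWR / 12V-2x6 replaces four 8-pins

def recommended_cap_w(stock_tgp_w: int) -> int:
    return min(stock_tgp_w, SAFE_W_PER_8PIN * EIGHT_PIN_EQUIV)

for card, tgp in (("RTX 5090", 575), ("RTX 5080", 360), ("RTX 5070 Ti", 300)):
    cap = recommended_cap_w(tgp)
    print(f"{card}: {tgp} W stock -> cap at {cap} W ({cap / tgp:.0%})")
# RTX 5090: 575 W stock -> cap at 480 W (83%), e.g. `nvidia-smi -pl 480`
# Everything 5080 and below already sits under 480 W, so nothing to do there.
```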


What about frame generation and shiet?


No comment. I can't be objective on this matter.
 
Well, not buying any GPU isn't ultra realistic. If you're stuck with something low-range that is >5 years old (so basically a 5700 / 5700 XT / 2060 / 2070), then the 9060 XT is okay-ish. Anything newer than that and it's a very meh proposition.
 
A pretty good guide! Though I should note you might have mistyped 32-bit PhysX as 32-bit CUDA. 32-bit PhysX is no longer supported on Blackwell cards, but CUDA definitely is. CUDA is a must for some AI and a few productivity workflows, but it doesn't matter for gaming.

A decent Ryzen 5000 series (which almost any AM4 mobo owner can upgrade to) should be able to run the Intel Arc B580 without any major issues. The performance degradation would be very small. If anyone does have a Ryzen 3000, 2000 or even 1000 series CPU, then I'd say a CPU upgrade is definitely in order alongside any contemporary graphics card.
 
Though I should note you might have mistyped 32-bit PhysX as 32-bit CUDA
PhysX is a part of CUDA. They removed all 32-bit CUDA instructions, including PhysX.
If anyone does have a Ryzen 3000, 2000 or even 1000 series CPU, then I'd say a CPU upgrade is definitely in order alongside any contemporary graphics card.
While true, it's way worse on Intel GPUs than it is on green or red ones. 3700X, for example, is fine with a 4060 Ti 16 GB (my acquaintance has a PC like this), whereas with B580, the performance would've tanked obscenely. Anyway, that guy is saving up for a better PC, he wants to play at 4K...
 
PhysX is a part of CUDA. They removed all 32-bit CUDA instructions, including PhysX.

While true, it's way worse on Intel GPUs than it is on green or red ones. 3700X, for example, is fine with a 4060 Ti 16 GB (my acquaintance has a PC like this), whereas with B580, the performance would've tanked obscenely. Anyway, that guy is saving up for a better PC, he wants to play at 4K...

Ah, I see. You are correct about CUDA. It's also true that lower-end nVidia or AMD cards are a better match for old processors than the Intel release. However, I'd still advise anyone still using an old processor to consider something like a 5700X3D: it should be compatible with the same motherboard and RAM after just a BIOS update, and they only cost $150-200.
 
However, I'd still advise anyone still using an old processor to consider something like a 5700X3D
Depends on the tasks and on the market. In some countries X3D chips are too expensive, and some people need multi-threaded CPU performance more than gaming performance, so they'd buy a 5950X instead.
 
So, let me get this straight, this is a so called "guide" that instead of guiding you to a wise choice, tells you at the start that ALL GPUs from AMD and Nvidia should be boycotted, and Intel's GPUs aren't worth buying?

That pretty much makes reading all the rest pointless, especially when the disclaimer is in massive caps! :rolleyes:

I mean, I get it, the GPU market is a real crap show lately, especially with AMD bowing out of the high end of it. I just don't see that threads like this are helping matters on that front.
 
To make it short:
The only viable brand-new lower mid-range GPU is the RX 9060 XT 16 GB (but only if you can get it at, or very close to, its MSRP).
The only viable upper mid-range card is the RTX 5070 (at least in my country), because it can be found for 600€, whereas even the cheapest 9070 model costs 80€ more at the moment.
The only viable high-end card is... surprise... the RTX 5090...

When it comes to the second-hand market:
The 2080 Ti is a good alternative to the RTX 5060 / 9060 8 GB.
The 3080 12 GB / 3080 Ti / 4070 are good alternatives to the 5060 Ti.
I would personally avoid all older AMD cards unless you can get them dirt cheap...
 
So, let me get this straight, this is a so called "guide" that instead of guiding you to a wise choice, tells you at the start that ALL GPUs from AMD and Nvidia should be boycotted, and Intel's GPUs aren't worth buying?
Wrong, in the sense that it's not the only thing it does. The longread that follows is an actual guide. I pretty much explained what to do if you actually need, or at least badly want, a new GPU, so I covered quite a lot of possible scenarios.
That pretty much makes reading all the rest pointless, especially when the disclaimer is in massive caps!
This disclaimer is built on my experience with RTX 3060 and RX 6700 XT. The latter surely has a bunch more calculating power for pure raster stuff, but the difference is nigh negligible in modern UE5 titles, and DLSS4 is so far ahead that you can compare DLSS Transformer-model Performance to FSR2/3 Native and see the latter lose in 19 scenarios out of 20 (at 1440p). Which means the 3060 is effectively FASTER than the more expensive 6700 XT, despite being a fairly underloved GPU built by so-called nGreedia on an inferior node. That's why I'm anti-AMD. They make false claims and unreliable products (not in the sense that they break, but in the sense that you can't expect them to age better than milk, despite some killer feature like significantly more VRAM than their nVidia price-mates), whereas even older nVidia GPUs get some love over time (the RTX 2000 series still gets feature updates despite being older than RDNA2/3, which are effectively headed for abandonware).

I would've loved to hold AMD in high regard. I would've loved for them to succeed. I would've loved for the market to be full of products actively trying to beat each other. But no, there's unfortunately none of that. They wronged us.
 
This disclaimer is built on my experience with RTX 3060 and RX 6700 XT.
That is not NEARLY enough to warrant saying "both AMD and Nvidia deserve a massive boycott", and, as I also said, it somewhat discredits the "guide". It also reduces the entire thread to a mere dime-a-dozen rant over just 2 GPUs you had a bad experience with. Boo hoo for you. :rolleyes:
 
Big yikes on this opinion piece ("buying guide" is not the term I'd use).


8GB a 'no go' in 2025???
Avoid Intel GPUs???
Ampere in 2025???
12VHPWR???
Can't bring yourself to discuss MFG???
This 'thinking' is based on 3000 and 6000 series NV/AMD cards???

With the utmost respect for the time it took to post this... thanks, but no thanks.
 
8GB a 'no go' in 2025???
If you want to pay serious amounts of real money and trap yourself into lowering settings in everything AAA since, like, 2024, be my guest. Mind you, 16 GB cards with the exact same calculating power can run such things fine or sometimes even great. I wouldn't have any trouble paying 150 for an 8 GB card, but at 300+ you can go rip someone else off.
Avoid Intel GPUs???
I explicitly stated they're better avoided if you have a CPU with relatively weak single-thread performance. The drivers are still not optimised for multi-threaded usage. If you have a stupid fast CPU, then Intel dGPUs deliver decent value, provided they're bought at MSRP.
Ampere in 2025???
As a healthier alternative to RDNA2/3, thanks to full DLSS upscaling support and better UE5 handling. Used 3080 series GPUs sell for dirt cheap, and that's convincing for people having a difficult time. Sure, these GPUs are far from ideal, but their price justifies it.
12VHPWR???
Name me something that should never have existed to a larger extent than this nonsensical connector. Its design is pure, distilled foulness, and even if it had been engineered correctly, I'm still not convinced there was anything requiring immediate attention in the first place.
Can't bring yourself to discuss MFG???
I never played with MFG. YouTube reviews of MFG aren't the most reliable source of information. Forgive me if my unwillingness to spout hot bollocks offended you.
Standard frame generation didn't convince me either; however, I never played with FG on for longer than a dozen minutes, so my opinion on frame generation DOES NOT YET EXIST.
 
If you want to pay serious amounts of real money and trap yourself into lowering settings in everything AAA since, like, 2024, be my guest. Mind you, 16 GB cards with the exact same calculating power can run such things fine or sometimes even great.
Prices today are what they are. They are poor, you're not wrong. To me, your opinion feels like it's rooted in there being lower-priced cards out there. But... for the moment, these are the bottom of the barrel.

Also, there's an awfully large contingent of budget esports players that would beg to differ. Certainly, if one can afford it, get the better card (whatever that means)... but to say unequivocally that 8GB is a no-go is patently false, especially if you look at the cross-section of 1080p gamers and the titles they play (according to Steam, 1080p is still the most widely used resolution, and esports games, which don't stress GPU vRAM as much, are also dominant). I don't buy GPUs for the exceptions to the rule, but it should definitely be a consideration.

I explicitly stated they're better avoided if you have a CPU with relatively weak single-thread performance. The drivers are still not optimised for multi-threaded usage. If you have a stupid fast CPU, then Intel dGPUs deliver decent value, provided they're bought at MSRP.
You did mention that, after this gem...
Intel dGPUs are a good addition to the mix as a concept, but their actual performance is limited, and you're mostly buying a cat in a bag with such GPUs.
To me, logic dictates that I need a fast CPU for low-res (1080p) gaming in the first place. Ryzen 2000 series and Intel 9th gen (7+ and almost 7 years old now) are long in the tooth and put a glass ceiling on almost any GPU. They also stated it would be patched... it's been ~2 months since, so eventually that will be a non-issue.
Name me something that should never have existed to a larger extent than this nonsensical connector. Its design is pure, distilled foulness, and even if it had been engineered correctly, I'm still not convinced there was anything requiring immediate attention in the first place.
We'll have to agree to disagree about the weight of the six confirmed(?) burnt connectors (you'd think, with how big it blew up, that if it were still a problem we'd see more of it... we don't). It's a poor design, but if you're using a proper (read: ATX 3.1 / PCIe 5.1) PSU and you plug it in right, there's little chance of failure (leans over, looks at 600W+ setup that's been fire-free for months now... smiles).

I mean, I like the attempt, but it's just filled with too much bias/jaded opinion. What kind of guide eschews discussing important factors because the author "can't be objective"??? Again, this falls somewhere between an editorial, a hit piece on the state of video cards, and a rant. Formulating an opinion and basing a guide on 2 gen-old cards and some scraped-together videos made for clicks just isn't where it's at for me. :)
 
@EarthDog, please come back when you've re-read my post CAREFULLY. Nothing you imply is correct. I could elaborate, but I see too much asinine hostility and don't see the point.
 
Also, I might add that games these days have gotten expensive, with smaller discounts on some titles even a year or two after their initial release. So your game library should also be considered when purchasing a GPU... would a shiny new pixel generator improve the experience of my current library?
 
Name me something that should never have existed to a larger extent than this nonsensical connector.
AMD Bulldozer and the rest of the construction line.
 
Operating system should also matter.

In early 2023 I still had to recompile the nvidia binary graphics drivers against the Linux kernel on every kernel update, the same as in the many years before.

AMD Radeon just works with the Linux kernel, whether it's the Ryzen 7600X's integrated graphics alone, the 7600X plus a Radeon 6800 non-XT, or the 7600X plus a Radeon 7800 XT, with no recompiling. No configuration file changes are needed with an OpenRC init, the Linux kernel and Mesa.
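For the curious, here's a minimal sketch of checking which kernel driver is actually bound to your GPU on Linux; it only reads sysfs, so it works on any distro. The in-tree amdgpu module shows up with no extra steps, while the nvidia binary driver only appears once its out-of-tree module has been built for the running kernel.

```python
# List PCI display controllers and the kernel driver bound to each (Linux only).
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    pci_class = (dev / "class").read_text().strip()
    if not pci_class.startswith("0x03"):  # 0x03xxxx = display controller
        continue
    driver_link = dev / "driver"
    driver = driver_link.resolve().name if driver_link.exists() else "(none bound)"
    print(f"{dev.name}: driver = {driver}")  # e.g. amdgpu, nvidia, i915, nouveau
```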

Intel should fix their wifi drivers first before I'd ever dare to test any Intel-based graphics card or other hardware again.
 
A buyers guide would list pricing and the relevant performance and feature sets.

It would also be helpful if you actually had most of the cards you've mentioned.

For example, I can give you relative performance numbers for the following cards based on ownership and benchmarking, where I can keep a strongly unbiased opinion of the hardware.

5070
4070
4060
3060
3050
7900 XT
6800
6700
6600
7600
6650
6400

That would include Ti and Super versions and XT and non-XT versions, also including some mGPU results with a few AMD cards, but not with NV cards. The list is definitely longer; that's just off the top of my head, from the last 24 months or so.

What all the cards above have in common is that they are all capable of playing the same games, just at different resolutions and game settings. This information would be nice to include in your buyers guide.

Hopefully in the future we can see a truly unbiased buyers guide. I could probably write one up with data and the prices I paid at the time, but pricing shifts with supply and demand. There's no guide that will last more than 6 months, I'm afraid. It would need frequent updates. We are halfway to 2026 already!!
 
It seems like you'd only recommend a used 4070 / Ti Super. But that has a 12VHPWR which you don't recommend. Used 4070 / Ti Super cards aren't even any cheaper than their brand new 9060 XT / 5060 Ti / 5070 / 9070 counterparts.

I agree with some recommendations like avoiding RDNA 2/3 (especially if you're buying midrange and can afford the new generation) and avoiding 8GB if you plan on playing current and future AAA games above console settings without problems.

But overall this guide makes every option sound bad.
 
It seems like you'd only recommend a used 4070 / Ti Super. But that has a 12VHPWR which you don't recommend. Used 4070 / Ti Super cards aren't even any cheaper than their brand new 9060 XT / 5060 Ti / 5070 / 9070 counterparts.

I agree with some recommendations like avoiding RDNA 2/3 (especially if you're buying midrange and can afford the new generation) and avoiding 8GB if you plan on playing current and future AAA games above console settings without problems.

But overall this guide makes every option sound bad.
You are not wrong... performance has been stale since the 50 series release and the jacked-up prices made it worse. Truly, 2025 is a terrible time to purchase a GPU.
 
But overall this guide makes every option sound bad.

Agreed here, the thread would be a lot more helpful with data, pricing and availability. At this stage, though, it's kind of moot, since older cards are no longer manufactured and there are no reliable sources for them except the second-hand market. Also, I wouldn't recommend the older series to anyone unless it's at a bargain fire-sale price.

There are still cards available at every single price point that are still somewhat better than their counterparts. MFG, for example, would still be great on the 5060/5060 Ti, allowing for 4x the frames where needed. I'd rather play a triple-A title at 144 fps with the input response of a ~36 fps base than at 30 frames altogether, for example, and only the newest cards can do this.
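For what it's worth, that trade-off is easy to put numbers on. A simplified model (ignoring the extra queuing latency FG itself adds): with 4x MFG only every fourth displayed frame is rendered, so input is sampled at the base rate, not the displayed rate.

```python
# Idealised 4x multi-frame generation: displayed fps vs. input sampling rate.
def mfg_breakdown(displayed_fps: float, factor: int = 4) -> tuple[float, float]:
    base_fps = displayed_fps / factor          # frames actually rendered
    input_interval_ms = 1000.0 / base_fps      # how often your input is sampled
    return base_fps, input_interval_ms

base, interval = mfg_breakdown(144, 4)
print(f"144 fps displayed -> {base:.0f} fps rendered, input every {interval:.1f} ms")
# 144 fps displayed -> 36 fps rendered, input every 27.8 ms,
# vs. native 30 fps: input every 33.3 ms AND a far choppier picture.
```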

12VHPWR isn't a big deal at all, especially on a 5080 and below, since they will never reach the 600 W capability of the cable.
 
A buyers guide would list pricing and the relevant performance and feature sets.
But you'd at best only be listing the price of each one at a specific time and place. At some point you have to come to the realization that Nvidia's extremely limited supply and lack of price enforcement, from the entry-level high end on up, has made enthusiast GPU purchasing as unstable a market as cryptocurrency! It's a scalper's market anymore, and it has been for some time.

It seems like you'd only recommend a used 4070 / Ti Super. But that has a 12VHPWR which you don't recommend. Used 4070 / Ti Super cards aren't even any cheaper than their brand new 9060 XT / 5060 Ti / 5070 / 9070 counterparts.

I agree with some recommendations like avoiding RDNA 2/3 (especially if you're buying midrange and can afford the new generation) and avoiding 8GB if you plan on playing current and future AAA games above console settings without problems.

But overall this guide makes every option sound bad.
Especially when he makes 12VHPWR sound like a bad design across the board. I went with a fairly high-end PSU (Seasonic 1200 Vertex Gold) that I got for a little more than $220, and its 12VHPWR cable is individually wired at the plug, so it's plenty flexible enough to avoid bending pins.

The problem with most of this "guide" is it is written mostly from a worst case scenario perspective on each point, which I find is common when someone is venting over bad product experiences, and basing it on just 2 GPUs is not nearly a wide enough base of knowledge to work from.
 
This guide reads more like an opinion piece, rather than an unbiased look at a snapshot in time of currently available GPU options and their pros/cons.
 
Especially when he makes 12VHPWR sound like a bad design across the board. I went with a fairly high-end PSU (Seasonic 1200 Vertex Gold) that I got for a little more than $220, and its 12VHPWR cable is individually wired at the plug, so it's plenty flexible enough to avoid bending pins.
Yeah, I'm not really sold on 12VHPWR being bad by default. Otherwise we'd have a much wider scale of reports, including from ASRock and Sapphire, who both use 12VHPWR on their top-end 9070 models. But there's been crickets...

NVIDIA seems to have issues with load balancing on their high end, but the issue hasn't exactly scaled down proportionally to other cards.
 
@EarthDog, please come back when you re-read my post CAREFULLY. Nothing you imply is correct. I could elaborate but I see too much asinine hostility and don't see a point.
I don't intend to be hostile. Sorry about that part. But I speak the truth, as do multiple others in this thread.
 