Mining GPUs are run at a constant temperature, rarely power or temperature cycled, and run far below maximum power. Not to mention the person running the mining card has a lot more reason to keep that card in good condition than the average gamer. I would choose an ex-mining card over a general use card every time, without question.
I see this awful take thrown around a lot, but it is misinformation spread by people who actually own "mining equipment" and are looking to dump it into the hands of unsuspecting gamers at the earliest signs of mild hardware failure, so I suppose it's high time I counter it myself:
1. You cannot seriously believe that every miner is running a well-conditioned environment with AC on 24/7 (i.e. even higher power bills, especially considering the combined heat output of many GPUs working together). That is objectively false and almost never the case: GPUs will keep running hot for a long time before anything visibly fails, and the electricity cost of keeping a cryptocurrency farm air-conditioned far exceeds the cost of simply replacing failing hardware (there's a rough back-of-envelope calculation further down). This is especially true for small-time mining operations where one of those miserable crypto bros bought himself a rig with 12 RTX 3080s and thinks he's buying his way into the metaverse;
2. You cannot seriously believe that every miner is running a heavily optimized power curve, especially in large-scale operations: each individual graphics card's ASIC has its own power characteristics, and it is simply impractical to develop custom BIOSes for each and every one of them when you are running 1000+ GPUs. Miners do not run Windows either, which makes it doubly difficult to manually test each card and assign it its optimal power curve (a sketch of what the lazy, blanket approach actually looks like follows further down). Additionally, power cycling the hardware is not only harmless but happens eventually even in high-uptime scenarios: servers must restart and receive maintenance periodically, which is why failover clustering and backup systems exist;
3. Mining is an exceptionally intensive workload that places a very heavy burden on the graphics card's power delivery system and, for the reasons above, WILL cause heat damage to the hardware. Even at an "acceptable" temperature, you are talking about tens of thousands of hours under a constant 80°C+ heat load: goodbye caps, goodbye MOSFETs (there's a quick lifetime calculation below). Memory ICs are particularly sensitive to heat damage; if you get your hands on a known mined GPU and place it side by side with a new one, you will see clear heat damage around the memory area, usually yellowing, and sometimes the plastic packaging around the memory die itself will feel warped. In fact, the GPU ASIC is usually the only thing that survives, and these ASICs have been finding their way into hardware recycling factories in China, resold through unlicensed OEMs that are not on AMD's or NVIDIA's board partner lists, to unsuspecting customers as new GPUs.
A high-profile example from recently is Afox's RX 580s that have been dumped on third-world markets: they are recycled Polaris ASICs with visible heat damage (discolored substrate, yellowed epoxy) that were simply pulled from dead GPUs and reinstalled onto new PCBs. There was a huge commotion over it when Latin America's largest hardware channel got wind of it and exposed it for everyone to see; at around 5:45 in their video there is an example of what I mean.
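To put point 1 in perspective, here is the kind of back-of-envelope math a miner does before deciding whether to air-condition the room. This is a minimal Python sketch and every number in it (GPU count, power draw, AC efficiency, electricity price) is my own assumption, so plug in your own:

```python
# Rough yearly cost of air-conditioning the hypothetical 12x RTX 3080 rig from point 1.
# All figures are assumptions for illustration, not measurements.
RIG_GPUS       = 12
WATTS_PER_GPU  = 230          # typical power-limited mining draw, not the stock TDP
AC_COP         = 3.0          # watts of heat removed per watt the AC consumes
KWH_PRICE_USD  = 0.12         # assumed electricity price
HOURS_PER_YEAR = 24 * 365

heat_kw      = RIG_GPUS * WATTS_PER_GPU / 1000                 # ~2.8 kW of heat, around the clock
cooling_cost = heat_kw / AC_COP * HOURS_PER_YEAR * KWH_PRICE_USD
mining_cost  = heat_kw * HOURS_PER_YEAR * KWH_PRICE_USD

print(f"AC electricity per year:     ${cooling_cost:,.0f}")    # roughly $1,000
print(f"Mining electricity per year: ${mining_cost:,.0f}")     # roughly $2,900
# The AC bill alone is on the order of two or three used replacement cards per year,
# every year - which is exactly why most small operations just let the room cook.
```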
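And on point 2, the "tuning" that actually happens on a headless Linux rig usually looks more like one blanket power limit slapped on every card, not a per-ASIC power curve. A minimal sketch of that, assuming the nvidia-ml-py (pynvml) bindings, an NVIDIA driver, and root privileges; the 200 W figure is just an example:

```python
# Minimal sketch: one power limit for every GPU in the rig, no per-card binning.
# Assumes the nvidia-ml-py package, an NVIDIA driver, and root privileges.
import pynvml

BLANKET_LIMIT_W = 200  # one value for the whole farm, regardless of silicon quality

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        # NVML works in milliwatts; clamp to what this particular board allows
        lo_mw, hi_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
        target_mw = max(lo_mw, min(hi_mw, BLANKET_LIMIT_W * 1000))
        pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
        print(f"GPU {i}: power limit set to {target_mw / 1000:.0f} W")
finally:
    pynvml.nvmlShutdown()
```

Note that the same number goes to every card; nobody is hand-testing a thousand ASICs for their individual sweet spot.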
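As for point 3, the usual rule of thumb for electrolytic capacitors (and a decent first-order proxy for the rest of the power stage) is that expected life roughly halves for every extra 10°C. A quick illustration, using a made-up rating for a hypothetical capacitor:

```python
# Rule-of-thumb aging model: expected life halves for every +10 degC.
# The rated figures below are assumptions for a hypothetical capacitor, not datasheet values.
RATED_HOURS  = 10_000
RATED_TEMP_C = 105

def expected_life_hours(actual_temp_c: float) -> float:
    """Double the rated life for every 10 degC below the rating (and halve it above)."""
    return RATED_HOURS * 2 ** ((RATED_TEMP_C - actual_temp_c) / 10)

for temp_c in (55, 65, 75, 85, 95):
    print(f"{temp_c} degC around the clock -> ~{expected_life_hours(temp_c):,.0f} h expected life")
# A gaming card spends most of its life idling at the cool, long-life end of this curve;
# a mining card parked at 80 degC+ spends every single hour at the hot, short-life end.
```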
It's akin to saying "okay, I am going to buy this mildly used, quote unquote, high-endurance MLC enterprise SSD! They last a lifetime!" except those drives have already had petabytes of data run through them and their SMART thresholds are about to trip; and since the drive will end up in a Windows system that won't complain until the bitter end, it'll go unnoticed until the final failure... you know it, I know it, the scalpers know it and the miners know it. It's only the average joe gamer who is lured in by the prospect of buying, at an "acceptable" price, a GPU that has already had 95% of its useful life run through it - this is miners double dipping and laughing all the way home after dropping this big fat turd on you.
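The SSD analogy is easy to put numbers on, and it's the same homework you should do before buying any "lightly used" flash: compare what has already been written against the rated endurance. Everything below is an assumption for a hypothetical drive; on a real one you would pull the total host writes out of smartctl -a output instead:

```python
# Hypothetical endurance check; all figures are assumptions for illustration.
# On a real drive, read the total host writes from smartctl -a output instead.
CAPACITY_TB        = 1.92     # hypothetical "high endurance" enterprise MLC drive
RATED_DWPD         = 3        # drive writes per day the vendor guarantees
WARRANTY_YEARS     = 5
ALREADY_WRITTEN_TB = 9_800    # "petabytes of data run through them"

rated_tbw = CAPACITY_TB * RATED_DWPD * 365 * WARRANTY_YEARS   # ~10,500 TB of rated endurance
used_pct  = 100 * ALREADY_WRITTEN_TB / rated_tbw

print(f"Rated endurance: {rated_tbw:,.0f} TB written")
print(f"Already written: {ALREADY_WRITTEN_TB:,} TB ({used_pct:.0f}% of rated life gone)")
# Windows won't say a word until the drive drops to read-only or vanishes outright,
# which is exactly why this kind of hardware gets flipped onto unsuspecting buyers.
```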
I could come up with more reasons not to buy a mined GPU, but they would all come down to the old adages: a deal that seems too good to be true most likely is, and a fool and his money are soon parted.