Processor | 7800X3D |
---|---|
Motherboard | MSI MAG Mortar b650m wifi |
Cooling | Thermalright Peerless Assassin |
Memory | 32GB Corsair Vengeance 30CL6000 |
Video Card(s) | ASRock RX7900XT Phantom Gaming |
Storage | Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB |
Display(s) | Gigabyte G34QWC (3440x1440) |
Case | Lian Li A3 mATX White |
Audio Device(s) | Harman Kardon AVR137 + 2.1 |
Power Supply | EVGA Supernova G2 750W |
Mouse | Steelseries Aerox 5 |
Keyboard | Lenovo Thinkpad Trackpoint II |
Software | W11 IoT Enterprise LTSC |
Benchmark Scores | Over 9000 |
I'm sure that a lot of GPUs last 5 years or more at stock, without any undervolt, even with a dusty heatsink and fan combination. But who's going to game on a GPU that's 5 years old or older? I'll buy products now, use 'em, and replace 'em, just like a car, just like my kitchen, just like anything that's designed to be replaceable. If that weren't the case, every one of you would still be running a Pentium 1 or AMD K5 with a Voodoo chip next to it.
As for your fancy heat story, VRMs are designed to withstand operating temperatures of 110 degrees. It's not really the VRMs that suffer, but the components sitting right next to them, like the capacitors. They have an estimated lifespan based on thermals: the hotter they run, the shorter their MTBF, basically. I wouldn't recommend playing on a card with a 100-degree VRM that has GDDR chips right next to it either, but it works, and there are cards that last many, many years before delivering their last frame.
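For anyone who wants to put numbers on that: electrolytic capacitor datasheets commonly quote lifetime via the 10-degree halving rule, an Arrhenius approximation where every 10 degrees above the rated temperature roughly halves the rated life. Here's a rough sketch of that arithmetic; the 5000 h / 105 C rating and the 6 h/day usage are illustrative, not from any specific card:

```python
# Rough sketch of the common 10-degree halving rule for electrolytic
# capacitor life (an Arrhenius approximation). All numbers are illustrative.

def cap_life_hours(rated_hours: float, rated_temp_c: float, actual_temp_c: float) -> float:
    """Estimated life: halves for every 10 C above the rated temperature,
    doubles for every 10 C below it."""
    return rated_hours * 2 ** ((rated_temp_c - actual_temp_c) / 10)

# Example: a 5000 h / 105 C rated cap sitting at various temperatures
for temp in (65, 85, 105):
    hours = cap_life_hours(5000, 105, temp)
    print(f"{temp} C -> ~{hours:,.0f} h (~{hours / (6 * 365):.1f} years at 6 h/day)")
```

Same cap, 40 degrees cooler, roughly 16x the life. That's the whole "hotter means shorter MTBF" point in one line of math.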
It just becomes more and more difficult to cool a small die area. It's why Intel and AMD use an IHS: not just to protect the die from being crushed by an over-tightened heatsink or waterblock, but to spread the heat more evenly. That's also why all the GPUs these days come with a copper baseplate, which pulls heat out of the chip faster than a material like aluminium does. AMD can ship a stock video card with a great cooler, but what's the point of that if the chip is designed to run at the 80-degree mark? The fan ramps up anyway if that's the case, and you can set that up in the driver settings as well. Big deal.
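To put a number on the copper-versus-aluminium point: steady-state conduction through a plate follows Fourier's law, so for the same heat load a copper baseplate needs roughly half the temperature gradient of an aluminium one. A quick back-of-the-envelope sketch; the plate dimensions and the 300 W load are made up for illustration:

```python
# Back-of-the-envelope Fourier's law: temperature drop across a cooler
# baseplate, delta_T = Q * t / (k * A). Dimensions and load are made up.

CONDUCTIVITY = {"copper": 400.0, "aluminium": 205.0}  # W/(m*K), typical values

def baseplate_delta_t(power_w: float, thickness_m: float, area_m2: float, k: float) -> float:
    """Temperature difference needed to push power_w through the plate."""
    return power_w * thickness_m / (k * area_m2)

# Example: a 300 W die heating a 3 mm plate over a 30x30 mm contact patch
for metal, k in CONDUCTIVITY.items():
    dt = baseplate_delta_t(300, 0.003, 0.03 * 0.03, k)
    print(f"{metal}: ~{dt:.1f} C drop across the plate")
```

Roughly 2.5 C versus 5 C across the plate itself, which is exactly why the part touching the die is copper even on coolers that are otherwise all aluminium fins.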
The only conclusion, then, is: time will tell.
I'm staying far away, regardless.
My GTX 1080 is now approaching 3 years post-release and I can easily see myself getting another year out of it. After that, I'll probably sell it for close to 100-150 EUR, because it still works perfectly fine. If you buy high-end cards, 3 years is short and a great moment to start thinking about an upgrade WITH a profitable sale of the old GPU.
Compare the resale value of Nvidia and AMD cards over the last five to seven years and you'll see my point. It's almost an Apple vs Android comparison: AMD cards lose value much faster, and this is why. It's too easy to chalk that up to 'branding' alone.
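To make the upgrade math concrete, the number that actually matters is net cost per year: (purchase price minus resale price) divided by years of use. A small sketch with illustrative prices (not real market data for any specific card):

```python
# Net cost of GPU ownership per year: (purchase - resale) / years owned.
# Prices below are illustrative, not real market data.

def cost_per_year(purchase_eur: float, resale_eur: float, years: float) -> float:
    return (purchase_eur - resale_eur) / years

scenarios = [
    ("strong resale after 3 years", 550, 150, 3),
    ("weak resale after 3 years",   550,  80, 3),
]
for label, buy, sell, yrs in scenarios:
    print(f"{label}: ~{cost_per_year(buy, sell, yrs):.0f} EUR/year")
```

Same purchase price, but the card that holds its value costs noticeably less per year to own, which is the whole argument for upgrading while the old card still sells.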