| System Name | BOX |
|---|---|
| Processor | Core i7 6950X @ 4.26 GHz (1.28 V) |
| Motherboard | X99 SOC Champion (BIOS F23c + bifurcation mod) |
| Cooling | Thermalright Venomous-X + 2x Delta 38mm PWM (push-pull) |
| Memory | Patriot Viper Steel 4000 MHz CL16 4x8GB (@ 3240 MHz CL12.12.12.24 CR2T @ 1.48 V) |
| Video Card(s) | Titan V (~1650 MHz @ 0.77 V, HBM2 1 GHz, Forced P2 state [OFF]) |
| Storage | WD SN850X 2TB + Samsung EVO 2TB (SATA) + Seagate Exos X20 20TB (4Kn mode) |
| Display(s) | LG 27GP950-B |
| Case | Fractal Design Meshify 2 XL |
| Audio Device(s) | Motu M4 (audio interface) + ATH-A900Z + Behringer C-1 |
| Power Supply | Seasonic X-760 (760 W) |
| Mouse | Logitech RX-250 |
| Keyboard | HP KB-9970 |
| Software | Windows 10 Pro x64 |
Due to "normalized" minimum 200-250W TDP on new cards (or Titan X(p) level), this isn't that big issue... for now. Performance of RTX-30/RDNA2 is "OK" vs. modern cards in that power range (3060(Ti)/67x0/66x0 vs. 4060/7600/A770/B580[?]).No. An old x90 card still consumes x90 power, not to mention the lack of warranty and the abuse it's probably been through by miners and overclockers.
Abuse by miners/overclockers doesn't negate the absolutely INSANE number of cards that were released onto the market during the mining boom. If a miner adjusted vGPU and kept the card cool (which meant higher efficiency, so they profited by doing it), there shouldn't be many more longevity issues than in a "usual" use case, I think.
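For a rough sense of why undervolting paid off for miners: dynamic CMOS power scales approximately with frequency times voltage squared, so a modest vGPU drop cuts board power (and heat) disproportionately for a given hashrate. The sketch below is a hypothetical back-of-envelope illustration; the 250 W baseline and the 0.85 V / 90% clock figures are assumptions, not measurements of any specific card.

```python
# Back-of-envelope: dynamic power scales roughly with f * V^2, so a small
# undervolt yields an outsized power (and heat) reduction.
# All figures are hypothetical, for illustration only.

def relative_dynamic_power(freq_ratio: float, volt_ratio: float) -> float:
    """Return power relative to stock, using P ~ C * f * V^2."""
    return freq_ratio * volt_ratio ** 2

stock_board_power_w = 250.0  # hypothetical stock board power
ratio = relative_dynamic_power(freq_ratio=0.90, volt_ratio=0.85)
print(f"Relative power: {ratio:.2f}x -> ~{stock_board_power_w * ratio:.0f} W")
# -> Relative power: 0.65x -> ~163 W, i.e. cooler and noticeably more efficient
```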