I think I am going to give Folding@home a try. The graphics cards I currently have available are:
- RX 570 8G
- GT 1030 2G GDDR5
- GT 730 1G DDR3
- GT 710 1G DDR3
I also have an Athlon 200GE with an integrated Radeon Vega 3 graphics processor, but I suspect it will not contribute a worthwhile amount, given it only has 192 stream processors running at 1000 MHz. I think the same can be said of the two GT 700-series cards. I would like to set up my GT 1030 for folding. Its small Pascal GP108 core should still be able to help, I think. It consumes very little power, and all it does currently is idle and provide display outputs. The way I see it, those CUDA cores might as well be doing some good too.
I apologise for the length of this post; I am mostly thinking out loud here. Any input you can provide will be much appreciated! I will highlight my main questions in bold to make them easier to find.
As I currently understand it, Folding at Home works better on Nvidia GPUs using CUDA than it does on AMD ones.
**Is it worth configuring my RX 570 8G to run folding when I am not gaming on it? The board has a power limit of 135 W; is the performance per watt of the Polaris 20 GPU simply not good enough to justify running it for long periods of time?**
My plan is to put the GT 1030 into my Ryzen 3 1200 cruncher and limit the BOINC client's allowed CPU usage to 50%, so that the folding client can use the remaining cycles to feed the GT 1030. The dual-1700 rigs will keep the 700-series cards and remain entirely dedicated to WCG.
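For reference, here is a minimal sketch of how I understand the v7 FAHClient's `config.xml` would look with a GPU slot added on that machine. The slot IDs, the `cpus` value, and the user/team values are my assumptions, and the 50% BOINC cap itself is set separately in BOINC's computing preferences rather than here:

```xml
<config>
  <!-- Placeholder identity values; replace with your own -->
  <user v='Ash'/>
  <team v='0'/>

  <!-- Small CPU slot so BOINC keeps most of the cores -->
  <slot id='0' type='CPU'>
    <cpus v='1'/>
  </slot>

  <!-- GPU slot; the client should auto-detect the GT 1030 -->
  <slot id='1' type='GPU'/>
</config>
```

If I have the mechanics of this wrong, corrections are welcome.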
I would like to invest in new hardware for Folding@home. This may ultimately lead me to turn the 1200 PC over entirely to F@H and purchase a motherboard with two mechanically x16 PCIe slots, so that I can run multiple GPUs in that machine for folding. On that note:
**I assume folding does not put much strain on the PCIe interface; is that correct? With that in mind, would a board like this one be suitable for running two GPUs for F@H?** The second slot is connected through the Promontory chipset using PCIe Gen 2 connectivity.
I may also investigate the recently spotted Turing-based GTX 1660 (non-Ti) card for this PC.
Thanks for reading. I would love to hear everyone's opinions and thoughts on my plans!
Ash