Buy a server GPU if you're going to be running server tasks.
I don't have to; most generation runs fine on average cards, though I do have a few Tesla and Radeon Pro cards.
The more complex the task, the more VRAM you need.
Without a GPU you can still run a GPT-style model, just not a very complex one:
GPT4All
For generating pictures, Stable Diffusion offline really needs a GPU (it runs without one, but not efficiently: on CPU alone it maxes out all 18 cores of my overclocked i9 and eats up 400 W).
There is a portable version.
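To put the CPU-only inefficiency in perspective, here is a rough energy-per-image comparison. The ~400 W CPU draw is from my setup above; the generation times and GPU wattage are illustrative assumptions, not measurements.

```python
# Rough energy-per-image comparison: CPU-only vs GPU Stable Diffusion.
# Only the ~400 W CPU draw comes from the post; the other numbers are
# assumed for illustration.

def wh_per_image(watts: float, seconds_per_image: float) -> float:
    """Energy in watt-hours consumed to generate one image."""
    return watts * seconds_per_image / 3600

# Assumed: CPU-only takes ~5 minutes per 512x512 image at ~400 W.
cpu = wh_per_image(400, 5 * 60)
# Assumed: a midrange GPU takes ~15 s per image at ~200 W.
gpu = wh_per_image(200, 15)

print(f"CPU-only: {cpu:.1f} Wh/image, GPU: {gpu:.2f} Wh/image")
```

Even with generous assumptions for the CPU, the GPU comes out more than an order of magnitude cheaper per image.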
![RtAnnCQxaVJNYgA4LbBhuJ.png](https://tpucdn.com/forums/data/attachments/323/323689-f97e2fc8a03d5d9a8bd75d32da5b1110.jpg)
It is fast enough to generate 512x512 pictures, and with more VRAM it can generate larger, more complex pictures.
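A back-of-the-envelope sketch of why bigger pictures need more VRAM: Stable Diffusion's UNet works on a latent tensor whose size scales with the image area (the image is encoded at 1/8 resolution with 4 channels). Real VRAM use (weights, attention, activations) is much larger than this, but it scales the same way.

```python
# Latent tensor size for Stable Diffusion at different output resolutions.
# The 1/8 downscale and 4 latent channels are standard for SD 1.x/2.x.

def latent_elements(width: int, height: int, channels: int = 4) -> int:
    """Number of elements in the latent tensor for a given output size."""
    return (width // 8) * (height // 8) * channels

base = latent_elements(512, 512)     # 64 * 64 * 4 = 16384 elements
big = latent_elements(1024, 1024)    # 128 * 128 * 4 = 65536 elements
print(big / base)  # doubling each side quadruples the latent, prints 4.0
```

So going from 512x512 to 1024x1024 roughly quadruples the memory pressure, which is why the jump to larger images hits the VRAM limit so quickly.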
The most important factors are VRAM size and speed; Nvidia cards are generally the fastest, since most tools are optimized for CUDA.
I got 4-6 images/minute with a cheap mining card (P104-100, roughly a GTX 1070). My second card, an A770 16 GB, is much faster, but it can't do CUDA, so a lot of tweaking is needed to use Intel Arc's AI acceleration.
I'm currently working on getting the Skyrim ChatGPT all-NPC mod to run locally (right now I use the openai.com API, but that isn't free). That will need a second CUDA-capable GPU to run offline.
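One way to make the switch from openai.com to a local model: several local runners (e.g. llama.cpp's server, GPT4All's API server) expose an OpenAI-compatible `/v1/chat/completions` endpoint, so a mod that speaks the OpenAI API can just be pointed at localhost. A minimal sketch, where the port and model name are assumptions for illustration:

```python
# Sketch: build an OpenAI-style chat-completion request for an NPC reply,
# aimed at a local OpenAI-compatible server instead of openai.com.
# The port and model name below are assumptions, not from the post.

LOCAL_BASE_URL = "http://localhost:4891/v1"  # assumed local server address

def chat_request(npc_name: str, player_line: str) -> dict:
    """Build an OpenAI-style chat-completion payload for an NPC reply."""
    return {
        "model": "local-model",  # whatever model the local server has loaded
        "messages": [
            {"role": "system",
             "content": f"You are {npc_name}, an NPC in Skyrim. Stay in character."},
            {"role": "user", "content": player_line},
        ],
        "max_tokens": 100,
    }

payload = chat_request("Lydia", "What should we do next?")
url = f"{LOCAL_BASE_URL}/chat/completions"
# POST `payload` to `url` with any HTTP client; the response JSON has the
# same shape as openai.com's, so the mod-side parsing doesn't change.
```

The appeal is that nothing in the mod's request/response handling has to change, only the base URL it talks to.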