Best GPUs for self-hosted AI?
(alien.top)
The Tesla P40 is a good low-budget option: it has 24 GB of VRAM and CUDA cores. I've run 13B LLMs on a single one and it performed well, and they're cheap enough that you can afford multiple cards if you have enough slots.
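As a rough sanity check (a back-of-the-envelope sketch, not a benchmark; the 1.5 GiB runtime-overhead figure and the helper name are my own assumptions), you can estimate whether a model's weights fit in the P40's 24 GB at a given quantization level:

```python
def model_vram_gib(params_b: float, bits_per_weight: float,
                   overhead_gib: float = 1.5) -> float:
    """Rough VRAM needed: weight storage plus an assumed runtime overhead.

    params_b        -- parameter count in billions (e.g. 13 for a 13B model)
    bits_per_weight -- quantization level (4 for 4-bit, 16 for fp16)
    overhead_gib    -- guessed allowance for KV cache and buffers (assumption)
    """
    weight_bytes = params_b * 1e9 * bits_per_weight / 8
    return weight_bytes / 1024**3 + overhead_gib

# A 13B model at 4-bit quantization fits the P40's 24 GiB with lots of room,
# while full fp16 weights alone would already overflow it.
print(f"13B @ 4-bit: {model_vram_gib(13, 4):.1f} GiB")
print(f"13B @ fp16:  {model_vram_gib(13, 16):.1f} GiB")
```

The estimate ignores context length, so treat it as a lower bound; long contexts grow the KV cache well past the fixed overhead assumed here.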