I'm just getting into Ollama and want to build some self-hosted AI applications. I don't need heavy-duty cards because they probably won't ever be under much load, so I'm mostly looking for power efficiency plus a decent price.
Any suggestions for cards I should look at? So far I've been browsing eBay and was looking at 24GB Tesla M40s (GDDR5). They're reasonably priced, but I'm wondering if anyone has specific recommendations.
Consider getting a P40 instead. It's a Pascal-generation chip versus the M40's older Maxwell, also with 24GB, so driver and CUDA support should last longer. It's worth the extra cost.
Make sure to source the needed power cables: these Tesla cards take a CPU-style 8-pin EPS connector, not a standard PCIe one.
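If it helps with sizing, here's a rough back-of-envelope sketch for how much VRAM a quantized model needs. The function name and the ~20% overhead factor are my own assumptions, not from any Ollama docs:

```python
def vram_estimate_gb(params_billion, bits_per_weight, overhead=1.2):
    """Rough VRAM (GiB) to hold a quantized model's weights.

    overhead is an assumed ~20% margin for KV cache and runtime
    buffers; real usage varies with context length and backend.
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1024**3

# A 13B model at 4-bit quantization:
print(round(vram_estimate_gb(13, 4), 1))  # roughly 7.3 GiB
```

By this rough estimate, a 24GB card like the P40 or M40 has comfortable headroom for ~30B-parameter models at 4-bit quantization.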
Good to know, I'll probably go for a P40 instead.