submitted 1 year ago by bia@lemmy.ml to c/localllama@sh.itjust.works

I’ve been using llama.cpp, gpt-llama and chatbot-ui for a while now, and I’m very happy with it. However, I’m now looking into a more stable setup that runs entirely on GPU. Is llama.cpp still a good candidate for that?
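
For context, this is roughly the kind of GPU-only setup I have in mind — a minimal sketch using the llama-cpp-python bindings (the package, model path, and parameter values here are assumptions, not my actual config):

```python
# Minimal sketch of a GPU-only llama.cpp setup via the llama-cpp-python
# bindings. Model path and layer count are placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-7b.ggmlv3.q4_0.bin",  # placeholder path to a quantized model
    n_gpu_layers=40,  # layers to offload to the GPU; set high enough to cover the whole model
    n_ctx=2048,       # context window size
)

result = llm(
    "Q: Is llama.cpp a good fit for GPU-only inference? A:",
    max_tokens=64,
    stop=["Q:"],
)
print(result["choices"][0]["text"])
```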

no comments (yet)
this post was submitted on 10 Jun 2023
1 points (100.0% liked)
