submitted 11 months ago* (last edited 11 months ago) by meteokr@community.adiquaints.moe to c/localllama@sh.itjust.works

Seems like a really cool project, lowering the barrier to entry for locally run models. Since llama.cpp supports a ton of models, I imagine it would be easy to adapt this for models other than the prebuilt ones.

[-] kurwa@lemmy.world 6 points 11 months ago

That's pretty dope. I've been imagining some programs where having a small enough locally runnable model could be useful.

this post was submitted on 03 Dec 2023
58 points (96.8% liked)

LocalLLaMA

2249 readers

Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

founded 1 year ago