394 points · submitted 1 month ago by Cr4yfish@lemmy.world to c/opensource@lemmy.ml
you are viewing a single comment's thread
[-] 01189998819991197253@infosec.pub 17 points 1 month ago* (last edited 1 month ago)

This is a really great use of an LLM! Seriously, great job! Once it's fully self-hostable (including the LLM model), I will absolutely find it space on the home server. Maybe using Rupesh's fastsdcpu as the model and generation backend could work. I don't remember what its license is, though.

Edit: added link.

[-] Cr4yfish@lemmy.world 10 points 1 month ago

Thanks! I'm already eyeing ollama for this.
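
For anyone wondering what "eyeing ollama" could look like in practice, here is a minimal sketch of calling a self-hosted Ollama instance over its local REST API. The model name ("llama3") and the prompt are assumptions for illustration; the endpoint and default port (11434) are Ollama's standard non-streaming `/api/generate` interface, and the model is assumed to have already been pulled with `ollama pull`.

```python
# Minimal sketch: query a locally running Ollama server.
# Assumptions: Ollama is serving on its default port 11434, and the
# "llama3" model has already been pulled (model choice is hypothetical).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def generate(prompt: str, model: str = "llama3") -> str:
    """Send one non-streaming generation request and return the text."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for the full response as a single JSON object
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming response carries the generated text
        # in the "response" field.
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("Explain in one sentence why self-hosting an LLM matters."))
```

Swapping the backend later would only mean pointing the request at a different local server, which is what makes a self-hosted setup like this appealing.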
