submitted 1 year ago by Ajnart@alien.top to c/main@selfhosted.forum

Hello! I am looking for a self-hosted or self-hostable LLM (something like LLaMA or ChatGPT), but a very, very small one that could run with as little as ~300 MB of RAM. It needs to have an API. I plan to integrate it into my dashboard project you might know, Homarr. I'd like to build some kind of assistant that helps directly within the app by using its integration capabilities.

The tool needs to be self-hosted so that users' queries are never sent to a third party. A freemium service that you can either self-host or pay for would also work.

It does not need to have a huge knowledge base (it doesn't need to know a good lobster recipe), just to be able to understand basic language inputs; in turn, I will make it communicate with the key parts of the app.

I apologize if this is not worded properly as I am fairly new to the world of LLMs.

[-] AK1174@alien.top 1 points 1 year ago

You could use LocalAI or Ollama, but neither is going to work with 300 MB of RAM, and you need a fair amount of compute for response speed to be usable. These models are also not very capable compared to OpenAI's GPTs, but whether that matters depends on what your goal is for them.
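For anyone curious what wiring this up might look like, here is a minimal sketch of calling Ollama's local HTTP API from TypeScript (the language Homarr is written in). The port is Ollama's documented default, and the model tag "tinyllama" is just an assumption for illustration; none of this is actual Homarr code.

```typescript
// Hypothetical sketch: query a locally running Ollama instance from a
// Node/TypeScript app. Assumes Ollama's default endpoint (localhost:11434)
// and a small model tag such as "tinyllama" that has already been pulled.
async function askLocalAssistant(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "tinyllama", // small model, but still needs far more than 300 MB of RAM
      prompt,
      stream: false, // return one JSON object instead of a token stream
    }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = (await res.json()) as { response: string };
  return data.response;
}

// Example usage:
// const answer = await askLocalAssistant("List my enabled integrations");
```

The same pattern works against LocalAI, which exposes an OpenAI-compatible endpoint instead, so the request shape would differ slightly.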

