The dream (lemmy.world)
[-] CeeBee@lemmy.world 37 points 11 months ago

It's getting there. In the next few years as hardware gets better and models get more efficient we'll be able to run these systems entirely locally.

I'm already doing it, but I have some higher end hardware.

[-] Xanaus@lemmy.ml 4 points 11 months ago

Could you please share your process for us mortals?

[-] CeeBee@lemmy.world 6 points 11 months ago

Stable Diffusion SDXL Turbo model running in Automatic1111 for image generation.

Ollama with Ollama-webui for an LLM. I like the Solar 10.7B model. It's lightweight, fast, and gives really good results.

I have some beefy hardware that I run it on, but it's not necessary to have.

[-] Ookami38@sh.itjust.works 2 points 11 months ago

Depends on what AI you're looking for. I don't know of an LLM (a language model, think ChatGPT) that works decently on personal hardware, but I also haven't really looked. For art generation, though, look up the Automatic1111 installation instructions for Stable Diffusion. If you have a decent GPU (I was running it slowly on a 1060 until I upgraded), it's a simple enough process to get started, there's tons of info online about it, and it all runs on local hardware.
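Once installed, the webui can also be driven programmatically rather than through the browser. A minimal sketch, assuming the webui was launched with the `--api` flag (which exposes a local REST API on port 7860 by default); the field names follow the `/sdapi/v1/txt2img` endpoint:

```python
import json
import urllib.request

# Default local address of the Automatic1111 webui when started with --api
API_URL = "http://127.0.0.1:7860/sdapi/v1/txt2img"

def build_txt2img_payload(prompt: str, steps: int = 20) -> dict:
    # Keep the request minimal: a prompt, a step count, and the image size
    return {"prompt": prompt, "steps": steps, "width": 512, "height": 512}

def txt2img(prompt: str) -> str:
    # POST the payload to the local webui and return the first generated
    # image as a base64-encoded PNG string
    data = json.dumps(build_txt2img_payload(prompt)).encode("utf-8")
    req = urllib.request.Request(
        API_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["images"][0]
```

Since everything runs locally, nothing leaves the machine; `txt2img("a watercolor fox")` would return the image data once the webui is up.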

[-] CeeBee@lemmy.world 2 points 11 months ago

I don't know of an LLM that works decently on personal hardware

Ollama with ollama-webui. Models like solar-10.7b and mistral-7b work nicely on local hardware. Solar 10.7B should run well on a card with 8 GB of VRAM.
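A minimal sketch of querying a local Ollama instance over its REST API (assuming the default port 11434 and that the model has already been fetched with `ollama pull solar`); the payload shape follows Ollama's `/api/generate` endpoint:

```python
import json
import urllib.request

# Ollama's default local endpoint
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # stream=False requests a single JSON response instead of a token stream
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    # Send the prompt to the local Ollama server and return the generated text
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With the server running, `generate("solar", "Summarize this thread")` returns the completion as a plain string; ollama-webui is just a friendlier front end over the same API.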

[-] ParetoOptimalDev@lemmy.today 1 points 10 months ago

If you have really low specs, use the recently open-sourced Microsoft Phi model.
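A rough back-of-the-envelope check of why these models fit on consumer cards (a sketch only: weight memory is roughly parameter count times bytes per weight, so 4-bit quantization needs about half a byte per parameter; real usage adds activations and context cache on top, so treat these as lower bounds):

```python
def approx_weight_gib(params_billions: float, bits_per_weight: int) -> float:
    """Weight-only memory estimate in GiB: params * (bits / 8) bytes."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 2**30

# At 4-bit quantization, as commonly used for local inference:
solar = approx_weight_gib(10.7, 4)  # ~5 GiB, fits on an 8 GB card
phi2 = approx_weight_gib(2.7, 4)    # ~1.3 GiB, fine for low-spec machines
```

The 2.7B figure assumes the Phi-2 release; the arithmetic is the same for any model size.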

this post was submitted on 25 Dec 2023
1922 points (97.9% liked)

People Twitter


People tweeting stuff. We allow tweets from anyone.
