[-] Scoopta@programming.dev 3 points 1 month ago* (last edited 1 month ago)

Ollama is also a cool way of running multiple models locally
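For anyone curious, here's a minimal sketch of talking to a locally running Ollama server over its HTTP API. This assumes Ollama's default local endpoint (http://localhost:11434, the `/api/generate` route) and that you've already pulled a model; the model name `llama3` and the `ask` helper are just illustrative choices, not anything specific to Ollama.

```python
# Minimal sketch: query a locally running Ollama server via its HTTP API.
# Assumes Ollama is installed and running, and the model has already been
# pulled (e.g. with `ollama pull llama3`). Endpoint and field names are
# Ollama's documented defaults; treat them as assumptions.
import json
import urllib.request


def ask(model: str, prompt: str) -> str:
    """Send one prompt to the local Ollama server and return its reply."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # get a single JSON object instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Swapping between local models is just a matter of changing the name.
    print(ask("llama3", "Explain GPU passthrough in one sentence."))
```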

[-] Retro_unlimited@lemmy.world 1 point 1 month ago

That might be the other one I run; I forget, because it’s on my server as a virtual machine (RTX 3080 passthrough) and I haven’t used it in a long time.
