submitted on 15 Oct 2023 by imgel@lemmy.ml to c/linux@lemmy.ml
[-] Senseless@feddit.de 53 points 1 year ago

"AI assistant" just seems like a euphemism for "increased tracking".

[-] AccidentalLemming@lemmy.world 34 points 1 year ago

AI assistants can hypothetically work completely offline without sending your requests to a remote server. Hypothetically.

[-] boonhet@lemm.ee 7 points 1 year ago* (last edited 1 year ago)

Imagine a standardized API for an optional AI assistant that you can easily disable, where you provide either your own LLM running locally, your own LLM running on your server (for enthusiasts or companies), or a third-party LLM service over the Internet.

Regardless of your DE, you could choose whether you want an AI assistant at all and where you want the model to run.
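
Just as a rough illustration of the idea (none of this is an existing spec; the class names and the localhost URL are made up, and the only real assumption is that many local and hosted LLM servers expose an OpenAI-compatible chat completions endpoint), the desktop side could code against one small interface and let the user decide where requests actually go:

```python
# Hypothetical sketch of a backend-agnostic assistant API; the names here
# are invented for illustration, not taken from any existing project.
from abc import ABC, abstractmethod
import requests


class AssistantBackend(ABC):
    """Anything that can answer a prompt: local model, home server, or cloud API."""

    @abstractmethod
    def complete(self, prompt: str) -> str: ...


class OpenAICompatibleBackend(AssistantBackend):
    """Talks to any server exposing an OpenAI-style /v1/chat/completions
    endpoint, whether it runs on localhost, a home server, or a paid service."""

    def __init__(self, base_url: str, model: str, api_key: str | None = None):
        self.base_url = base_url.rstrip("/")
        self.model = model
        self.api_key = api_key

    def complete(self, prompt: str) -> str:
        headers = {"Authorization": f"Bearer {self.api_key}"} if self.api_key else {}
        resp = requests.post(
            f"{self.base_url}/v1/chat/completions",
            json={"model": self.model,
                  "messages": [{"role": "user", "content": prompt}]},
            headers=headers,
            timeout=60,
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]


class DisabledBackend(AssistantBackend):
    """The 'easily disable' case: a DE could ship this as the default."""

    def complete(self, prompt: str) -> str:
        raise RuntimeError("AI assistant is disabled in settings")


# The DE only ever sees the AssistantBackend interface; whether the URL
# points at localhost or a third-party API is the user's choice.
backend: AssistantBackend = OpenAICompatibleBackend(
    base_url="http://localhost:8080",  # e.g. a local llama.cpp server
    model="local-model",
)
```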

[-] hackris@lemmy.ml 4 points 1 year ago

I've had this idea for a long time now, but I don't know shit about LLMs. GPT-style models can be run locally though, so I guess only the API part is needed.

[-] boonhet@lemm.ee 3 points 1 year ago

I've run LLMs locally before; it's the unified API for digital assistants that would be interesting to me. Then we'd just need an easy way to acquire LLMs that laymen could use, but any bigger DE or distro could probably provide a setup wizard for that.
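
For the setup-wizard angle, a small config file could carry the whole choice of backend and model. This is only a sketch with invented file paths and keys, not something any DE actually ships:

```python
# Hypothetical: what an "assistant setup wizard" might write out, and how
# the assistant could read it back. Path and key names are made up.
import tomllib  # Python 3.11+
from pathlib import Path

CONFIG = Path.home() / ".config" / "assistant" / "backend.toml"
# Example contents the wizard might write:
#   enabled = true
#   provider = "local"                    # "local" | "self-hosted" | "cloud"
#   endpoint = "http://localhost:8080"
#   model = "mistral-7b-instruct"

with CONFIG.open("rb") as f:
    cfg = tomllib.load(f)

if not cfg.get("enabled", False):
    raise SystemExit("AI assistant is disabled")

print(f"Using {cfg['provider']} backend at {cfg['endpoint']} with {cfg['model']}")
```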

Check out the KoboldAI and koboldassistant projects. That's literally the thing you're describing, and it's open source.

[-] AccidentalLemming@lemmy.world 1 points 1 year ago

But can it do useful things, and does it run well on average hardware?

[-] superguy@lemm.ee 4 points 1 year ago

Yeah. I'm really annoyed by this trend of having programs that could function offline require connecting to a server.

[-] AccidentalLemming@lemmy.world 1 points 1 year ago

It's not so much a trend as it is companies loving software-as-a-service, because it gives them a lot of control and, in many cases, recurring income.

Not just hypothetically but practically too. A FOSS program called KoboldAI lets you run LLMs locally on your computer, and a project that takes advantage of this is the koboldassistant project. You can essentially make your own Alexa, Cortana, or Siri that doesn't collect your data and belongs to you.
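
For anyone who wants to poke at this, a local KoboldAI-style server can be queried with a few lines of Python. The port, endpoint, and field names below are from memory and may differ between KoboldAI/koboldcpp versions, so check the project's API docs before relying on them:

```python
# Rough sketch of querying a locally running KoboldAI-style server.
# Endpoint and JSON fields are assumptions; verify against the actual API docs.
import requests

KOBOLD_URL = "http://localhost:5001/api/v1/generate"  # default port may differ

def ask_local_model(prompt: str) -> str:
    resp = requests.post(
        KOBOLD_URL,
        json={"prompt": prompt, "max_length": 200},
        timeout=120,
    )
    resp.raise_for_status()
    # Typical KoboldAI-style response shape: {"results": [{"text": "..."}]}
    return resp.json()["results"][0]["text"]

print(ask_local_model("Summarise why local LLMs matter for privacy:"))
```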

[-] taanegl@beehaw.org 1 points 1 year ago

An open-source, locally run LLM on a GPU or a dedicated open-hardware PCIe card, never touching the cloud...
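
The GPU half of that wish is already doable today: with llama-cpp-python and a downloaded GGUF model, inference runs entirely offline with layers offloaded to the GPU. The model path and parameters here are just placeholders:

```python
# Runs a GGUF model entirely on the local machine; with n_gpu_layers set,
# llama.cpp offloads layers to the GPU. No network access involved.
from llama_cpp import Llama

llm = Llama(
    model_path="/path/to/model.gguf",  # any GGUF model you have downloaded
    n_gpu_layers=-1,                   # offload all layers to the GPU
    n_ctx=4096,                        # context window size
)

out = llm(
    "Explain in one sentence why running an LLM locally helps privacy:",
    max_tokens=64,
)
print(out["choices"][0]["text"])
```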
