this post was submitted on 10 Apr 2025
530 points (96.5% liked)

Technology

[–] avidamoeba@lemmy.ca 17 points 5 days ago (1 children)
[–] Deceptichum@quokk.au 9 points 5 days ago (5 children)

Swapping one shit for another isn’t a solution.

Go use a fine-tuned model free from corpo and state filth.

[–] LarmyOfLone@lemm.ee 6 points 4 days ago* (last edited 4 days ago) (1 children)

At least it's open source, you can run it locally, and it's more energy efficient. Give credit where credit is due. A truly independent AI group could simply fork it and review it. And contrary to fascist talking points, not all Chinese researchers bow to the CCP.

[–] Deceptichum@quokk.au 0 points 4 days ago (1 children)

You can run Llama models locally for free; pretty much every AI player releases stuff out there for everyone to use for free. It’s surprisingly “open” like that.

And you cannot run the real R1; that still requires a huge investment. You can run reduced quants, but they're barely any different from the other current public offerings.
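
For example, here's a minimal sketch of what running one of those locally looks like with the llama-cpp-python bindings and a quantized GGUF file. The model path is just a placeholder for whatever quant you've downloaded; CPU-only inference works, just slowly:

```python
# Minimal sketch: local, offline inference with llama-cpp-python and a quantized GGUF.
# Assumes `pip install llama-cpp-python` and a .gguf file already downloaded (path is a placeholder).
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-3-8b-instruct.Q4_K_M.gguf",  # placeholder: any quantized GGUF you have locally
    n_ctx=4096,      # context window
    n_threads=8,     # CPU threads; nothing leaves your machine
)

out = llm("Explain the difference between open weights and open source.", max_tokens=200)
print(out["choices"][0]["text"])
```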

[–] LarmyOfLone@lemm.ee 3 points 4 days ago* (last edited 4 days ago) (1 children)

It seems you can run the full R1 on a 96GB RAM gaming rig, even without a GPU. I have little practical experience because of my old PC, but it seems both efficiency improvements for running it locally (at least for a group of people) and research into bias or poisoning are being done.
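
For rough scale, here's a quick back-of-envelope on what R1's published 671B parameter count implies for raw weight size at different quantization levels. It ignores KV cache and the memory-mapping from disk that these CPU-only setups lean on, so treat it as a ballpark only:

```python
# Back-of-envelope: approximate weight storage for a 671B-parameter model
# at a few quantization levels (ignores KV cache and runtime overhead).
PARAMS = 671e9

for bits in (16, 8, 4, 1.58):
    gib = PARAMS * bits / 8 / 2**30
    print(f"{bits:>5} bits/param -> ~{gib:,.0f} GiB of weights")
```

Which is roughly why these builds lean on aggressive quantization and memory-mapping weights from fast storage rather than fitting everything in RAM.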

[–] CheeseNoodle@lemmy.world 3 points 4 days ago

BRB gotta buy another 64gig RAM kit.

[–] Melvin_Ferd@lemmy.world 3 points 4 days ago* (last edited 4 days ago) (1 children)

Are you suggesting that people on the left should abandon all these new powerful technologies to the right? We're in a capitalist society. You're suggesting "fuck it" and leaving these powerful new products to only right-leaning consumers.

What the hell. It's getting to be so hard to be on the left. Some of the most ineffective strategies on the planet. It looks like a doomed political group. There's no forethought.

This suggestion was just as bad as people saying don't use AI.

We have to use it and we have to adopt it. Stop giving up powerful tools, technology, and space to the right. I'm getting to be so done with the left if they just continue to shit the fucking bed.

[–] Deceptichum@quokk.au 1 points 4 days ago (1 children)

What?

I’m suggesting you run these things locally instead of through some corporation’s servers.

[–] avidamoeba@lemmy.ca 1 points 2 days ago

I think they're talking about the suggestion to not use DeepSeek because China. And I think most here are assuming self-hosted LLMs. I am.

[–] Pirata@lemm.ee 4 points 4 days ago

Go use a fine-tuned model free from corpo

How does it get fine-tuned in the first place?
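
To make that concrete, here's a rough sketch of the usual route, using Hugging Face transformers + peft with LoRA. The base checkpoint name is a placeholder, but the point stands: the starting weights come from somebody's corporate training run.

```python
# Rough sketch of how a "fine-tuned model" typically gets made: start from a
# corporate-released base model and attach small trainable adapters (LoRA).
# Model name is a placeholder; assumes `pip install transformers peft`.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = "meta-llama/Llama-3.1-8B"  # placeholder: the corpo-trained base you start from
model = AutoModelForCausalLM.from_pretrained(base)
tokenizer = AutoTokenizer.from_pretrained(base)

# Attach small trainable adapter matrices; the base weights stay frozen.
lora = LoraConfig(r=16, lora_alpha=32, target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM")
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # typically well under 1% of the base model's parameters
```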

[–] shani66@ani.social 3 points 4 days ago (2 children)
[–] Jrockwar@feddit.uk 5 points 4 days ago (1 children)

"There are some bad things on the internet"

"Just... Don't use the internet?"

[–] shani66@ani.social 2 points 4 days ago

The Internet isn't universally bad

[–] Deceptichum@quokk.au 3 points 4 days ago (1 children)
[–] msage@programming.dev 3 points 4 days ago (1 children)
[–] Deceptichum@quokk.au 6 points 4 days ago* (last edited 4 days ago) (1 children)

Design, coding, writing, general queries are mostly what I use it for.

[–] Mubelotix@jlai.lu 3 points 4 days ago (1 children)
[–] aesthelete@lemmy.world 2 points 4 days ago

Yes, I use it to generate my glue sandwich recipes daily.