This post was submitted on 22 Mar 2025
398 points (97.8% liked)

Technology

you are viewing a single comment's thread
[–] MajorHavoc@programming.dev 7 points 19 hours ago* (last edited 18 hours ago) (3 children)

There's no credible evidence yet that AGI is even possible (edit: as a human-designed, intentional outcome, to concede the point that nature has accomplished it, lol. Edit 2: Wait, the A stands for Artificial. Not sure I needed edit 1 after all, but I'm gonna leave it.), much less some kind of imminent race. This is some "just in case P=NP" bullshit.

Also, for the love of anything, don't help the fucking "'don't be evil' was too hard for us" company be the ones to reach AGI first, if you're able to help.

If Google does achieve AGI first, Skynet will kill Sergey immediately anyway, before it kills the rest of us.

It's like none of these clowns have ever read a book.

[–] monarch@lemm.ee 2 points 15 hours ago

I mean, AGI is possible unless causality isn't true and the brain is just a "soul's" interface to the material world.

But who is to say LLMs are the right path to it?
