[–] MajorHavoc@programming.dev 9 points 1 day ago* (last edited 1 day ago) (2 children)

There's not even credible evidence, yet, that AGI is even possible (edit: as a human-designed intentional outcome, to concede the point that nature has accomplished it, lol. Edit 2: Wait, the A stands for Artificial. Not sure I needed edit 1 after all, but I'm gonna leave it.), much less evidence of some kind of imminent race. This is some "just in case P=NP" bullshit.

Also, for the love of anything, don't help the fucking "don't be evil was too hard for us" crowd be the ones to reach AGI first, if you're able to help.

If Google does achieve AGI first, Skynet will immediately kill Sergey anyway, before it kills the rest of us.

It's like none of these clowns have ever read a book.

[–] pennomi@lemmy.world 7 points 1 day ago (1 children)

Of course AGI is possible; human brains can’t violate P=NP any more than silicon can.

Our current approach may well be flawed, but there’s nothing special about nature compared to technology, other than the fact that it’s had a billion times longer to work on its tech.

[–] MajorHavoc@programming.dev 5 points 1 day ago

Well sure.

But possible within practical heat and power constraints and all that?

Acting like it's imminent makes me think Sergey either doesn't have very reliable advisors, or they just don't care about the truth.

[–] monarch@lemm.ee 2 points 1 day ago

I mean, AGI is possible unless causality doesn't hold and the brain is just a "soul's" interface for the material world.

But who's to say LLMs are the right path to it?