Ugh.
No, it isn’t. It’s not even close.
LLMs do not count. Large Language Models don’t actually think; they’re just good at mimicking the statistical patterns in their training data. They don’t understand what they’re telling you.
Claiming that emergence is real only tells me that OOP doesn’t know what AGI (Artificial General Intelligence) is. Emergence requires AGI, and no one has developed AGI that’s anywhere near ready for prime time. There’s no guarantee that anyone ever will, regardless of how much money gets thrown at the problem. For all we know, even our best, most well-thought-out approaches are completely wrong. We are learning as we go. Anyone who tells you otherwise is either uninformed or a charlatan. I can’t overstate how difficult this problem is.
I’m into science fiction. I’d love to see AGI. I’m not one of those people who shoot down ideas for fun. I have to be realistic, though. Even relatively simple LLMs have turned out to be more difficult and complex than experts predicted. I think a “wait and see” attitude is the only one that makes sense.