And how do you know LLMs can't tell that they are involved in a conversation?
Unless you think there is something non-computational in the human brain, you must accept that computers are, in theory, capable of thinking, given the right software and sufficiently powerful hardware.
Given that truth (which I think you can only avoid through religion or quantum quackery), you can't just say "it's only maths; it can't be thinking", because we already know that maths can think: the brain does it.
Do LLMs "think"? The definition of "think" is woolly enough, and we understand LLMs little enough, that it's quite an assertion to say they definitely don't.
It has no memory, for one. What makes you think it knows it's in a conversation?
It has very short-term memory in the form of its token context, especially with something like Meta's Coconut.
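To make that concrete, here is a minimal sketch of a stateless chat loop. The generate() function is a hypothetical stand-in for any LLM backend (no real API is assumed); the point is that the model keeps no state between calls, and its only "memory" is the token context the caller re-sends every turn.

```python
# Minimal sketch of a stateless chat loop. generate() is a hypothetical
# placeholder for an LLM call, not a real library API.

def generate(context: str) -> str:
    """Placeholder for an LLM call: maps a token context to a reply."""
    return f"(reply conditioned on {len(context)} chars of context)"

history: list[str] = []  # the conversation so far, kept by the *caller*

def chat_turn(user_message: str) -> str:
    history.append(f"User: {user_message}")
    # The model "remembers" the conversation only because we replay the
    # whole history here. Drop this and every turn starts from scratch.
    context = "\n".join(history)
    reply = generate(context)
    history.append(f"Assistant: {reply}")
    return reply

print(chat_turn("Are you aware this is a conversation?"))
print(chat_turn("What did I just ask you?"))  # answerable only via replayed context
```

If the caller stopped replaying the history, each turn would start from a blank slate; that is the narrow sense in which the model "has no memory" beyond its context window.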
I don't really. Yet. But I also don't think it is fundamentally impossible for LLMs to think, as you seem to. Nor do I think the definition of the word "think" is so narrow that it requires that level of self-awareness. Do you think a mouse is really aware it is a mouse? What about a spider?