this post was submitted on 03 Aug 2025
Programmer Humor
you are viewing a single comment's thread
view the rest of the comments
I can assure you LLMs are, in general, not great at extracting concepts and working with them the way a human mind does. LLMs are statistical parrots that have learned to associate queries with certain output patterns: chunks of code, chunks of text, and so on. They are not really intelligent, certainly not the way a human is, and because of this they cannot follow instructions the way a human does. The problem is that they seem just intelligent enough to fool someone who wants to believe they're intelligent, even though there is no intelligence, by any measure, behind their replies.
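To make the "statistical parrot" point concrete, here's a toy sketch in plain Python: a bigram model I made up for illustration. Real LLMs are transformers with billions of parameters over subword tokens, not word-level bigram tables, but the training objective is the same flavour: predict the next token from statistics over text, with no model of meaning behind it.

```python
# Toy "statistical parrot": a bigram model that only knows which word
# tends to follow which, with zero understanding of what any word means.
import random
from collections import defaultdict, Counter

corpus = "the cat sat on the mat the dog sat on the rug".split()

# Count how often each word follows each other word in the training text.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def babble(start: str, length: int = 8) -> str:
    """Generate text by repeatedly sampling a statistically likely next word."""
    word, out = start, [start]
    for _ in range(length):
        counts = follows.get(word)
        if not counts:
            break
        # Pick the next word in proportion to how often it followed the
        # current one in training -- pure association, no comprehension.
        word = random.choices(list(counts), weights=counts.values())[0]
        out.append(word)
    return " ".join(out)

print(babble("the"))  # e.g. "the cat sat on the rug the dog sat"
```

The output looks vaguely like English because the statistics of English are baked in, not because anything understood the sentence. Scale that idea up by many orders of magnitude and you get fluent text that can pass for understanding without being it.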
It also doesn't help that you have AGI evangelists like Yarvin and Musk who keep saying that the techno-singularity/techno-god is the ONLY WAY TO SAVE US and that we're RIGHT ON THE EDGE, so a lot of dumb fucks see that and go "well, obviously this is like querying an average human mind with access to all of human knowledge, so superhuman intelligence must be right around the corner."