this post was submitted on 26 Aug 2025
70 points (98.6% liked)

Tech

1836 readers
229 users here now

A community for high quality news and discussion around technological advancements and changes

Things that fit:

Things that don't fit

Community Wiki

founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] towerful@programming.dev 3 points 1 day ago (1 children)

Programming isn't about syntax or language.
LLMs can't do problem solving.
Once a problem has been solved, the syntax and language are easy.
Reasoning about the problem is the hard part.

Like the classic case of "how many 'r's in 'strawberry'", LLMs would state 2 occurrences.

Just check Google's AI Mode.
The strawberry problem was found and widely reported on, and has been specifically solved.

Prompted "how many 'r's in the word 'strawberry'":

There are three 'r's in the word 'strawberry'. The letters are: S-T-R-A-W-B-E-R-R-Y.

Prompted "how many 'c's in the word 'occurrence'":

The word "occurrence" has two occurrences of the letter 'c'.

So, the specific case has been solved. But not the problem.
In fact, I could slightly alter my prompt and get either 2 or 3 as the answer.
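And the counting itself is trivial to do deterministically in ordinary code, which is what makes the failure telling. A minimal sketch in Python (the helper name is my own, not from any tool mentioned above):

```python
# Deterministic letter counting: the task LLMs stumble on
# is a one-liner once you operate on characters, not tokens.
def count_letter(word: str, letter: str) -> int:
    return word.lower().count(letter.lower())

print(count_letter("strawberry", "r"))   # 3
print(count_letter("occurrence", "c"))   # 3, not the 2 the LLM claimed
```

An LLM sees tokens rather than individual characters, so it can't reliably "look at" the letters the way this code does.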

[–] Part4@infosec.pub 1 points 1 day ago* (last edited 1 day ago)

None of this contradicts anything in my post.

Edit - but I will add that the AI agent is written to manage the limitations of the LLM: to do, in a very loose sense, the kind of 'thinking' (they don't really think) that the LLM can't do on its own (to try and briefly address the point in your post).