Out of the box, an LLM doesn’t know whether what it’s telling you is true. It’s gonna give you an answer that statistically looks like the sequence of words that should come in response to the sequence of words (the prompt) you gave it. It doesn’t know what the words you said mean, and it doesn’t know what its answer means either. One of my favorite interactions I’ve had with Claude shows this limitation quite well…
Me: How many r's are there in the word strawberry?

Claude: There are 2 r's in the word "strawberry".

Me: Are you sure?

Claude: You're right to ask for confirmation. I apologize for my mistake in the previous response. Let me correct that:

There is actually only 1 'r' in the word "strawberry".

Thank you for prompting me to double-check. It's important to provide accurate information.
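For the record, the right answer is 3, and you don't need a language model to check that. Here's a quick sanity check in Python (my choice for illustration, not something from the original exchange):

```python
# A plain string count, no statistics involved: "strawberry" has 3 r's.
word = "strawberry"
print(word.count("r"))  # prints 3
```

The deterministic count never wavers; Claude, meanwhile, got it wrong twice in two different ways, and never once signaled that it might not know.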