submitted 4 months ago* (last edited 4 months ago) by Timely_Jellyfish_2077@programming.dev to c/chatgpt@lemmy.world

Small rant: basically, the title. If, instead of answering every question, it said when it doesn't know the answer, it would be trustworthy.

[-] Puttaneska@lemmy.world 1 points 4 months ago

It seems that ChatGPT does sometimes know that what it's offered is wrong, and actually knows a better answer when challenged.

I’ve often asked for code help that hasn’t worked. Then I’ve gone to other sources and found that ChatGPT was wrong about something and that there’s an alternative approach. When this is put back to ChatGPT, it says that I’m correct (x can’t do y) and offers a perfect solution.

So it looks like it sometimes knows what it appears not to know, but inexplicably doesn’t give the correct info immediately.

[-] Nougat@fedia.io 1 points 4 months ago

No, it’s responding to your comment suggesting something different by giving you something different. It has no idea what’s correct or incorrect. You do, so when you give it input that you know is more correct, of course it’s going to respond by telling you you’re right.

Try feeding it incorrect answers as though they are correct and see what happens.
