submitted 1 year ago by Alfa@lemmy.world to c/chatgpt@lemmy.world
[-] Karmmah@lemmy.world 72 points 1 year ago

I also noticed that ChatGPT can't actually correct itself. It just says "oh sorry, here's something different" and gives you another crap answer. I noticed it with code specifically. If I remember correctly, it was better when it was brand new.

[-] squiblet@kbin.social 25 points 1 year ago

The apology thing is sort of hilarious. I wonder what exactly they did to make it eternally apologetic. There was an article on HN recently about how it is basically impossible to get ChatGPT to stop apologizing; if you ask it to stop, it will apologize for apologizing.

[-] relevants@feddit.de 8 points 1 year ago

It's because humans have rated potential responses and ChatGPT has been trained to generate the kind of responses that most consistently get preferred ratings. You can imagine how an AI trained to say what people want to hear would become a people pleaser.

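For anyone curious, the mechanism described above is usually called reinforcement learning from human feedback (RLHF), and the "humans rated potential responses" part is typically used to train a reward model on pairwise preferences. Here's a minimal toy sketch of that reward-modelling step; it assumes responses are already reduced to small feature vectors and uses a single linear layer as the reward model, which is nothing like the real setup but shows the pairwise preference loss.

```python
# Toy sketch of RLHF-style reward modelling: train a scorer so that
# responses humans preferred get higher scores than rejected ones.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical data: 32 preference pairs, each response as an 8-dim feature vector.
preferred = torch.randn(32, 8) + 0.5  # stand-in for responses raters liked
rejected = torch.randn(32, 8) - 0.5   # stand-in for responses raters disliked

# Reward model: higher output = "more likely to be preferred by a rater".
reward_model = nn.Linear(8, 1)
optimizer = torch.optim.Adam(reward_model.parameters(), lr=0.05)

for step in range(200):
    r_pref = reward_model(preferred)
    r_rej = reward_model(rejected)
    # Pairwise (Bradley-Terry) loss: push preferred scores above rejected ones.
    loss = -torch.nn.functional.logsigmoid(r_pref - r_rej).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(f"final loss: {loss.item():.3f}")
# The chat model is then fine-tuned to maximise this learned reward,
# which is why it drifts toward whatever raters consistently upvote,
# apologies included.
```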