submitted 1 year ago by Razgriz@lemmy.world to c/chatgpt@lemmy.world

I was using Bing to create a list of countries to visit. Since I have been to the majority of the African nations on that list, I asked it to remove the African countries...

It simply replied that it couldn't do that because it would be unethical to discriminate against people, and yada yada yada. I explained my reasoning, it apologized, and it came back with the exact same list.

I asked it to check the list, since it hadn't removed the African countries, and the bot simply decided to end the conversation. No matter how many times I tried, it would always hit a hiccup because of some ethical process in the background messing up its answers.

It's really frustrating, I dunno if you guys feel the same. I really feel the bots have become waaaay too tip-toey

sab@kbin.social 1 point 1 year ago

Also, travel advice tends to change over time due to current events that language models might not capture. What was a tourist paradise two years ago might be in a civil war now, and vice versa. Or maybe it was a paradise two years ago and has since been completely ruined by mass tourism.

In general, asking actual people isn't a bad idea.
