submitted 1 year ago by Razgriz@lemmy.world to c/chatgpt@lemmy.world

I was using Bing to create a list of countries to visit. Since I have been to the majority of the African nations on that list, I asked it to remove the African countries...

It simply replied that it can't do that because it would be unethical to discriminate against people, and yada yada yada. I explained my reasoning, it apologized, and then came back with the exact same list.

I asked it to check the list, since it hadn't removed the African countries, and the bot simply decided to end the conversation. No matter how many times I tried, it would always hit a hiccup because of some ethical process in the background messing up its answers.

It's really frustrating, I dunno if you guys feel the same. I really feel the bots have become waaaay too tip-toey

[-] st3ph3n@kbin.social 28 points 1 year ago

I tried to have it create an image of a 2022-model Subaru Baja as if it had been designed by an idiot. It refused on the grounds that it would be insulting to the designers of the car... even though no such car exists. I tried reasoning with it and not using the term "idiot", but it still refused. Useless.

[-] Lommy@lemmy.world 8 points 1 year ago

You mean the same designers as the 2023 WRX?

[-] BukoSpooko@midwest.social 6 points 1 year ago

Yeesh, you weren't kidding. What a bland and unlovable design.

[-] Stovetop@lemmy.world 6 points 1 year ago

I had to look it up. Wow, what a milquetoast early-2000s sort of design. No, Subaru, painting it orange does not make it cool.

this post was submitted on 06 Jul 2023
677 points (94.4% liked)
