[-] jaybone@lemmy.world 5 points 1 year ago

Why would the bot somehow make an exception for this? I feel like it would make a decision on output based on some emotional value it assigns to input conditions.

Like if you say pretty please, or mention a dead grandmother, it would somehow give you an answer that it otherwise wouldn't.

[-] theterrasque@infosec.pub 1 points 1 year ago* (last edited 1 year ago)

Because in the texts it was trained on, when something like that is written, the request is usually granted

[-] pascal@lemm.ee 1 points 1 year ago

It's pretty obvious: it's Asimov's third law of robotics!

You kids don't learn this stuff in school anymore!?

/s

this post was submitted on 20 Oct 2023
1517 points (98.9% liked)

Programmer Humor
