1204
submitted 3 months ago by ElCanut@jlai.lu to c/programmerhumor@lemmy.ml
you are viewing a single comment's thread
view the rest of the comments
[-] eestileib@sh.itjust.works 28 points 3 months ago

LLM system input is unsanitizable, according to NVidia:

The control-data plane confusion inherent in current LLMs means that prompt injection attacks are common, cannot be effectively mitigated, and enable malicious users to take control of the LLM and force it to produce arbitrary malicious outputs with a very high likelihood of success.

https://developer.nvidia.com/blog/securing-llm-systems-against-prompt-injection/

[-] MalReynolds@slrpnk.net 2 points 3 months ago

Everything old is new again (GIGO)

this post was submitted on 07 Jun 2024
1204 points (92.7% liked)

Programmer Humor

32054 readers
1336 users here now

Post funny things about programming here! (Or just rant about your favourite programming language.)

Rules:

founded 5 years ago
MODERATORS