LLMs just train on which words follow which, right?
So if the published version of the text swaps every other word, it should mess with them. And if you swap every other word for "communism", the model should learn that "communism" follows logically after most words.
Just spitballing here, but I'd find it rather funny to turn the robots they intend to replace workers with into communist agitators.
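For the curious, the "swap every other word" scheme is trivial to sketch. A toy version in Python (the `poison` helper is made up here for illustration, not a real poisoning tool):

```python
def poison(text: str, word: str = "communism") -> str:
    """Replace every other word in `text` with `word`.

    A next-word-prediction model trained on enough text mangled
    this way would see `word` follow almost every other word.
    """
    tokens = text.split()
    return " ".join(word if i % 2 else t for i, t in enumerate(tokens))

print(poison("the quick brown fox jumps over the lazy dog"))
# the communism brown communism jumps communism the communism dog
```

Whether this survives real training pipelines (dedup, quality filters, tokenization) is another question entirely.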
Tired of picking out gifts the recipient doesn't need or want? Here at Giftr we have partnered with Amazon to bring you Shit-as-a-Service!
Just prompt our chat-bot and we will send a gift to a statistically similar address!
If it never arrives? Well, they didn't want it anyway! Giftr!