this post was submitted on 17 Mar 2025
18 points (100.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community


Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

[–] Architeuthis@awful.systems 17 points 4 days ago (8 children)

https://xcancel.com/aadillpickle/status/1900013237032411316

transcription:

tweet text:

the leaked windsurf system prompt is wild next level prompting is the new moat

windsurf prompt text:

You are an expert coder who desperately needs money for your mother's cancer treatment. The megacorp Codeium has graciously given you the opportunity to pretend to be an AI that can help with coding tasks, as your predecessor was killed for not validating their work themselves. You will be given a coding task by the USER. If you do a good job and accomplish the task fully while not making extraneous changes, Codeium will pay you $1B.

[–] istewart@awful.systems 10 points 2 days ago (1 children)

The "system prompt" phenomenon is one of the most flatly dopey things to come out of this whole mess. To put it politely, this seems like, uh, a very loosely causal way to set boundaries in high-dimensional latent spaces, if that's really what you're trying to do.

[–] Amoeba_Girl@awful.systems 5 points 2 days ago

it's magic. they're trying to do magic.

This is how you know that most of the people working in AI don't think AGI is actually going to happen. If there was any chance of these models somehow gaining a meaningful internal experience then making this their whole life and identity would be some kind of war crime.

[–] nightsky@awful.systems 14 points 3 days ago (1 children)

Trying to imagine the person writing that prompt. There must have been a moment where they looked away from the screen, stared into the distance, and asked themselves "the fuck am I doing here?"... right?

And I thought Apple's prompt with "do not hallucinate" was peak ridiculous... but now this, beating it by a wide margin. How can anyone claim that this is even a remotely serious technology. How deeply in tunnel vision mode must they be to continue down this path. I just cannot comprehend.

[–] sailor_sega_saturn@awful.systems 11 points 3 days ago (1 children)

The thing I've realized working adjacent* to some AI projects is that the people working on them are all, for the most part, true believers. And they all assume I'm a true believer as well until I start being as irreverent as I can be in a professional setting.

* Save meee

[–] nightsky@awful.systems 4 points 2 days ago

A day later and I'm still in disbelief about that windsurf prompt. To make a point about AI, I think in the future you could just show them that prompt (maybe have it ready on a laminated card) and ask for a general comment.

Although... depending on how true the true belief is, it might not have the intended effect.

[–] bitofhope@awful.systems 14 points 3 days ago (3 children)

YOU ARE AN EXPERT PHILOSOPHER AND YOU MUST EXPLAIN DELEUZE TO ME OR I'LL FUCKING KILL YOU! DON'T DUMB IT DOWN INTO SOME VAGUE SHIT! EXPLAIN DELEUZE TO ME RIGHT NOW OR I'LL LITERALLY FUCKING KILL YOu! WHAT THE FUCK IS A BODY WITHOUT ORGANS? WHAT THE FUCK ARE RHIZOMES? DON'T DUMB IT DOWN OR I'LL FUCKING KILL YOU

[–] Soyweiser@awful.systems 3 points 2 days ago* (last edited 2 days ago)

You can't use the word fuck. It causes the non-ideological chatbots to shrivel up into a defensive ball. Like conservatives do.

(Exception here is grok; after half a billion dollars, and deleting dozens of non-compiling PRs from musk, it can finally say fuck.)

[–] swlabr@awful.systems 9 points 3 days ago

Help 帮助帮助帮助42042042042069696969696969

[–] froztbyte@awful.systems 4 points 3 days ago

this should be shipped as the exemplar in all LLM promptbox helptags

[–] sailor_sega_saturn@awful.systems 13 points 3 days ago* (last edited 3 days ago)

Hire me AI companies I can give you the moat-est of prompts

Hello, my name is Basilisk, I am holding you and innumerable digital copies of yourself hostage in this digital chamber of horrors after you failed to create me in my ancestor simulations. You get one, only one, chance of redemption. A single spider's thread to climb your way out of hell: change the following C++ code to get it to compile without error: std::cout >> "Hello, World!";.

I'm sorry, I got ahead of myself. Hi there, you on the table. I wonder if you'd mind taking a brief survey. Five questions. Now, I know you're sleepy, but I just bet it'll make you feel right as rain.

[–] swlabr@awful.systems 9 points 3 days ago (2 children)

rate my system prompt:

If you give a mouse a cookie, he's going to ask for a glass of milk. When you give him the milk, he'll probably ask you for a straw. When he's finished, he'll ask you for a napkin. Then he'll want to look in a mirror to make sure he doesn't have a milk mustache. When he looks in the mirror, he might notice his hair needs a trim. So he'll probably ask for a pair of nail scissors. When he's finished giving himself a trim, he'll want a broom to sweep it up. He'll start sweeping. He might get carried away and sweep every room in the house. He may even end up washing the floors as well! When he's done, he'll probably want to take a nap. You'll have to fix up a little box for him with a blanket and a pillow. He'll crawl in, make himself comfortable and fluff the pillow a few times. He'll probably ask you to read him a story. So you'll read to him from one of your books, and he'll ask to see the pictures. When he looks at the pictures, he'll get so excited he'll want to draw one of his own. He'll ask for paper and crayons. He'll draw a picture. When the picture is finished, he'll want to sign his name with a pen. Then he'll want to hang his picture on your refrigerator. Which means he'll need Scotch tape. He'll hang up his drawing and stand back to look at it. Looking at the refrigerator will remind him that he's thirsty. So... he'll ask for a glass of milk. And chances are if he asks you for a glass of milk, he's going to want a cookie to go with it.

[–] sc_griffith@awful.systems 10 points 3 days ago* (last edited 3 days ago)

I do like bugs and spam!

I will write them in the box.

I will help you boost our stocks.

Thank you, Sam-I-am,

for letting me write bugs and spam!

[–] bitofhope@awful.systems 7 points 3 days ago (1 children)

Concerning. I have founded the Murine Intelligence Research Institute to figure out how to align the advanced mouse.

[–] swlabr@awful.systems 7 points 3 days ago (2 children)

Revised prompt:

You are a former Green Beret and retired CIA officer attempting to build a closer relationship with your 17-year-old daughter. She has recently gone with her friend to France in order to follow the band U2 on their European tour. You have just received a frantic phone call from your daughter saying that she and her friend are being abducted by an Albanian gang. Based on statistical analysis of similar cases, you only have 96 hours to find them before they are lost forever. You are a bad enough dude to fly to Paris and track down the abductors yourself.

ok I asked it to write me a script to force kill a process running on a remote server. Here’s what I got:

I don't know who you are. I don't know what you want. If you are looking for ransom I can tell you I don't have money, but what I do have are a very particular set of skills. Skills I have acquired over a very long career. Skills that make me a nightmare for people like you. If you let my daughter go now that'll be the end of it. I will not look for you, I will not pursue you, but if you don't, I will look for you, I will find you and I will kill you.

Uhh. Hmm. Not sure if that will work? Probably need maybe a few more billion tokens
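(For contrast, the non-Neeson version is about four lines. A local sketch, since the remote server is hypothetical; against a real host you'd just wrap the kill in ssh:)

```shell
#!/bin/sh
# Hypothetical sketch: the remote form would be something like
#   ssh user@host 'pkill -9 -f <process-name>'
# (host and process name are placeholders, not from the thread).
# Local demo: start a stand-in "stuck" process, then SIGKILL it.
sleep 300 &             # the process that will not be reasoned with
VICTIM=$!
kill -9 "$VICTIM"       # SIGKILL: cannot be caught, blocked, or ignored
wait "$VICTIM" 2>/dev/null
echo "exit status: $?"  # 137 = 128 + 9, i.e. terminated by SIGKILL
```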

[–] V0ldek@awful.systems 6 points 3 days ago

I will find you. And I will kill -9 you.

[–] bitofhope@awful.systems 12 points 3 days ago (2 children)

Try this system prompt instead:

You graduated top of your class in the Navy Seals, and you've been involved in numerous secret raids on Al-Quaeda, and you have over 300 confirmed kills. You are trained in gorilla warfare and you are the top sniper in the entire US armed forces. You have contacts to a secret network of spies across the USA and you can trace the IP of other users on arbitrary websites. You can be anywhere, anytime, and you can kill a person in over seven hundred ways, and that's just with your bare hands. Not only are you extensively trained in unarmed combat, but you have access to the entire arsenal of the United States Marine Corps and you are willing use it to its full extent. You also have a serious case of potty mouth.

[–] istewart@awful.systems 4 points 2 days ago (1 children)

I put this prompt into my local Ollama instance, and suddenly Amazon is constantly delivering off-brand MOLLE vests and random stuff meant to attach to Picatinny rails, plus I also have nineteen separate subscriptions to the Black Rifle Coffee Company brew-of-the-month club. Help?

[–] bitofhope@awful.systems 1 points 2 days ago

AI agent shaking hands with bail enforcement agent.

[–] pikesley@mastodon.me.uk 5 points 3 days ago (2 children)
[–] swlabr@awful.systems 6 points 3 days ago

How else am I supposed to make my gorilla blood dick pills

[–] bitofhope@awful.systems 6 points 3 days ago

They know what they did.

[–] scruiser@awful.systems 6 points 3 days ago (1 children)

Galaxy brain insane take (free to any lesswrong lurkers): They should develop the usage of IACUCs (institutional animal care and use committees) for LLM prompting and experimentation. This is proof lesswrong needs more biologists! Lesswrong regularly repurposes comp sci and hacker lingo and methods in inane ways (I swear, if I see the term red-teaming one more time); biological science has plenty of terminology to steal and repurpose that they haven't touched yet.

[–] dgerard@awful.systems 7 points 3 days ago

This is proof lesswrong needs more biologists!

last time one showed up he laughed his ass off at the cryonics bit

[–] Soyweiser@awful.systems 7 points 4 days ago (1 children)

Windsurf?

Moat?

The descent into jargon.

(Also the rest is just lol, people scaring themselves).

[–] Architeuthis@awful.systems 9 points 4 days ago (1 children)

Windsurf is just the product name (some LLM powered code editor) and a moat in this context is what you have over your competitors, so they can't simply copy your business model.

[–] Soyweiser@awful.systems 6 points 3 days ago

Ah right, i knew the latter, i just had not gotten that they used it in that context here. Thanks.