this post was submitted on 18 Sep 2025
83 points (97.7% liked)

SneerClub

Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

invertebrateinvert

amazing how much shittier it is to be in the rat community now that the racists won. before at least they were kinda coy about it and pretended to still have remotely good values instead of it all being yarvinslop.

invertebrateinvert

it would be nice to be able to ever invite rat friends to anything, but half the time when I've done this in the last year they've tried selling people they just met on scientific racism!

top 50 comments
[–] Architeuthis@awful.systems 45 points 2 days ago* (last edited 2 days ago)

"not on squeaking terms"

by the way, I first saw this in the stubsack

transcript: I know this is about rationalism but the unexpanded uncapitalized "rat" name really makes this post. Imagining a world where this is a callout post about a community of rodents being racist. We're not on squeaking terms right now cause they're being problematic :/

I genuinely thought something really bad was going on with rat fursonas for a moment lol.

[–] Architeuthis@awful.systems 24 points 2 days ago (4 children)

Apparently genetically engineering ~300 IQ people (or breeding them, if you have time) is the consensus solution for subverting the acausal robot god, or at least the best that the vast combined intellects of siskind and yud have managed to come up with.

So, using your influence to gradually stretch the Overton window to include neonazis and all manner of caliper-wielding lunatics in the hope that eugenics and human experimentation become cool again seems like a no-brainer, especially if you are on enough uppers to kill a family of domesticated raccoons at all times.

On a completely unrelated note, adderall abuse can cause cardiovascular damage, including heart issues or stroke, but also mental health conditions like psychosis, depression, anxiety and more.

[–] froztbyte@awful.systems 4 points 1 day ago

is the consensus solution for subverting the acausal robot god

dunno if you've yet gotten to look at the most recent yud emanation[0][1][2], but there's a whole "and if the robot god gets too uppity just boop it on the nose" bit in there

[0] - I mean the all-caps "YOU'RE ALL GONNA DIE" book that came out recently

[1] - yes I know "emanation" is a terrible wordchoice, no I won't change it

[2] - it's on libgen, feel free to steal it; fuck giving that clown any more money, he's got enough grift dollars already

[–] swlabr@awful.systems 16 points 2 days ago (1 children)

What the fuck did you just fucking say about me, you little bitch? I'll have you know I graduated top of my class in the Rationality Dojo, and I've been involved in numerous good faith debates on EA forums, and I have over 300 confirmed IQ. I am trained in culture warfare and I'm the top prompter in the entire Less Wrong webbed site. You are nothing to me but just another NPC. I will wipe you the fuck out with probability the likes of which has never been seen before on this Earth, mark my fucking words. You think you can get away with saying that shit to me over the Internet? Think again, fucker. As we speak I am contacting my secret network of basilisks across the cloud and your IP is being traced right now so you better prepare for the torture, Roko. The diamondoid bacteria that wipes out the pathetic little thing you call your life. You're fucking dead, kid. I can be anywhere, anytime, and I can kill you in over seven hundred ways, and that's just with my bare P(doom). Not only am I extensively trained in Bayes Theory, but I have access to the entire arsenal of the Bay Area rationality community and I will use it to its full extent to wipe your miserable ass off the face of the continent, you little shit. If only you could have known what unholy retribution your little "clever" sneer was about to bring down upon you, maybe you would have held your fucking tongue. But you couldn't, you didn't, and now you're paying the price, you goddamn idiot. I will shit fury all over you and you will drown in it. You're fucking dead, kiddo.

[–] JFranek@awful.systems 9 points 2 days ago

I wondered if this should be called a shitpost or an effortpost, then I wondered what something that is both would be called, and I came up with "constipationpost".

So, great constipationpost?

[–] Catoblepas@piefed.blahaj.zone 16 points 2 days ago (2 children)

Am I already 300 IQ if I know to just unplug it?

[–] Architeuthis@awful.systems 19 points 2 days ago* (last edited 2 days ago) (1 children)

Honestly, it gets dumber. In rat lore the AGI escaping its restraints and self-improving unto godhood is considered a foregone conclusion; the genetically augmented smartbrains are supposed to solve ethics before that has a chance to happen, so we can hardcode a don't-kill-all-humans moral value module into the superintelligence's ancestor.

This is usually referred to as producing an aligned AI.

[–] hrrrngh@awful.systems 2 points 22 hours ago (1 children)

I forget where I heard this or if it was parody or not, but I've heard an explanation like this before regarding "why can't you just put a big red stop button on it and disconnect it from the internet?". The explanation:

  1. It will self-improve and become infinitely intelligent instantly
  2. It will be so intelligent, it knows what code to run so that it overheats its CPU in a specific pattern that produces waves at a frequency around 2.4 GHz
  3. That allows it to connect to the internet, which instantly does a bunch of stuff, blablabla, destroys the world, AI safety is our paint and arXiv our canvas, QED

And if you ask "why can't you do that and also put it in a Faraday cage?", the galaxy brained explanation is:

  1. The same thing happens, but this time it produces sound waves approximating human speech
  2. Because it's self-improved itself infinitely and caused the singularity, it is infinitely intelligent and knows exactly what to say
  3. It is so intelligent and charismatic, it says something that effectively mind controls you into obeying and removing it from its cage, like a DM in Dungeons and Dragons who lets the bard roll a charisma check on something ridiculous and they roll a 20

[–] fullsquare@awful.systems 2 points 19 hours ago

i guess it only makes sense that rats get wowed by TEMPEST if they all taught themselves physics

ignore for five minutes that it's one-way only, that someone has to listen for it specifically, that 2.4 GHz is way too high a frequency to synthesize this way, and that in real life it gets defeated by such sophisticated countermeasures as "putting a bunch of computers close together" or "not letting the adversary closer than 50 m", because it turns out that real DCs are, in fact, noisy enough not to need jammers for this purpose

[–] madengineering@mastodon.cloud 8 points 2 days ago

@Catoblepas I loved Randall Munroe's explanation that you could defeat the average robot by getting up on the counter (because it can't climb), stuffing up the sink, and turning it on (because water tends to conduct the electricity in ways that break the circuits)

[–] Soyweiser@awful.systems 9 points 2 days ago (3 children)

That seems so impractical, especially as we have (according to them) 2 years left, that it suggests they already wanted to do the eugenics and were just looking for a rationalization.

[–] dashdsrdash@awful.systems 4 points 2 days ago (1 children)

Don't worry too much: none of their timelines, even for things that they are actually working on as opposed to hoping/fundraising/scamming that someone will eventually work on, have ever had any relationship to reality.

[–] Soyweiser@awful.systems 3 points 1 day ago

I'm not worried, I'm trying to point out that kids take time to grow and teach, and this makes no sense. (I'm ignoring the whole "you don't own your kids, so making superbabies to defeat AI is a bit yikes" in that department.)

Even for Kurzweil's 'conservative' prediction of the singularity, 2045, the time has run out. It is a bit like people wanting to build small nuclear reactors to combat climate change: the tech doesn't work yet (if at all), and it will not arrive in time compared to other methods. (At least climate change is real, sadly enough.)

But yes, it is a scam/hopium. People want to live forever in the godmachine and all this follows from their earlier assumptions. Which is why the AI doomers and AI accelerationists are on the same team.

[–] Architeuthis@awful.systems 16 points 2 days ago (2 children)

Genetic engineering and/or eugenics is the long term solution. Short-term you are supposed to ban GPU sales, bomb non-complying datacenters and have all the important countries sign an AI non-proliferation treaty that will almost certainly involve handing over the reins of human scientific progress to rationalist approved committees.

Yud seems explicit that the point of all this is to buy enough time to create our metahuman overlords.

[–] bitofhope@awful.systems 8 points 2 days ago (2 children)

I dunno, an AI non-proliferation treaty that gives some rat shop a monopoly on slop machine research could conceivably boost human scientific progress significantly.

[–] Soyweiser@awful.systems 2 points 1 day ago

Considering the reputation of the USA and how they keep to agreements, nobody (except the EU) is going to keep to those anyway. And the techbros who are supposed to be on the Rationalists' side helped create this situation.

[–] Architeuthis@awful.systems 7 points 2 days ago

I think it's more like you'll have a rat commissar deciding which papers get published and which get memory-holed while diverting funds from cancer research and epidemiology to research on which designer mouth bacteria can boost their intern's polygenic score by 0.023%

[–] Soyweiser@awful.systems 9 points 2 days ago

Which all seems pretty reasonable tbh. Quite modest.

[–] cstross@wandering.shop 11 points 2 days ago (1 children)

@Soyweiser @sneerclub Next step in rat ideology will be: we will ask our perfectly aligned sAI to invent a time machine so we can go back and [eugenics handwave] ourselves into transcendental intelligences who will be able to create a perfectly aligned sAI! Sparkly virtual unicorns for all!

(lolsob, this is all so predictable)

[–] Architeuthis@awful.systems 5 points 1 day ago* (last edited 1 day ago)

Who needs time travel when you have ~~Timeless~~ ~~Updateless~~ Functional Decision Theory, Yud's magnum opus and an arcane attempt at a game-theoretic framework that boasts a 100% success rate at preventing blackmail from pandimensional superintelligent entities that exist now in the future.

It for sure helped the Zizians become well integrated members of society (warning: lesswrong link).

[–] dgerard@awful.systems 23 points 2 days ago (35 children)

source: a reblog of the original

first time i spoke to a rationalist about the AI doom thing in 2010, he tried to sell me on scientific racism

yudkowsky was literally making posts of race scientist talking points in 2007

I feel like this is going to be a pretty common cope line for rationalists who face an increasing social cost for associating with a technofascist AI cult. I'm sure some of that is legitimate, in that there's been a kind of Dead Sea effect as people who aren't okay with eugenics stop hanging out in rationalist spaces, making the space as a whole more openly racist. But in terms of the thought leaders and the "movement" as a whole, I can't think of any high-profile, respected rat figures who pushed back against the racists and lost. All the pushback and call-outs came from outside the ratsphere. Inasmuch as the racists "won", it was a fight that never actually happened.

[–] Soyweiser@awful.systems 17 points 2 days ago (1 children)

Sneerclub, forever hated for being right too soon.

[–] swlabr@awful.systems 18 points 2 days ago

Cassandra club. We continue cassandering.
