this post was submitted on 01 Sep 2025
20 points (100.0% liked)

SneerClub

[–] Architeuthis@awful.systems 9 points 4 days ago* (last edited 4 days ago) (3 children)

This was an excellent read if you're aware of the emails but never bothered to read his citations or to dig into what the blather about object-level and meta-level problems was specifically about, which is presumably most people.

So, a deeper examination of the email paints 2014 Siskind as a pretty run-of-the-mill race realist who's really into "black genes are dumber, you guys" studies and who thinks that higher education institutions not taking them seriously means they are deeply broken and untrustworthy, especially with anything to do with pushing back against racism and sexism. Oh, and he is also very worried that immigration may destroy the West, or at least he gently urges you to get up to speed with articles coincidentally pushing that angle, and draw your own conclusions based on pure reason.

Also it seems that in private he takes seriously stuff he has already debunked in public, which makes it basically impossible to ever take anything he writes in good faith.

[–] excerpta@zirk.us 7 points 4 days ago (3 children)

@Architeuthis @dgerard "...impossible to ever take anything he writes in good faith."

See also this unguarded moment from Tumblr. All the alpha is in bad faith social engineering!
https://www.reddit.com/r/SneerClub/comments/9lj3g7

If I am 30% of the way from socialist to libertarian, and all of my friends are 10% of the way from socialist to libertarian, I think it’s fair to tell my friends “No, look! Libertarians make some good points! We need to pay more attention to the way libertarians think instead of hating them and rejecting everything they say out of hand!” This doesn’t make me a libertarian - I’m still only 30% of the way from socialist to libertarian and so more on the socialist side...I thought I had an SSC post where I explained this further, but I can’t find it. The gist was that if everyone else is at 10% and you think the correct answer is 30%, you can either argue for 30 and have them compromise at 20%, or you can argue 50% and have them compromise at 30%. I’m not sure there’s a right answer to this question, but I sometimes end up arguing for 50% and I think this is at least a defensible choice.

[–] Architeuthis@awful.systems 10 points 4 days ago* (last edited 4 days ago)

I wonder if this is just a really clumsy attempt to invent stretching the Overton window from first principles or if he really is so terminally rationalist that he thinks a political ideology is a sliding scale of fungible points and being 23.17% ancap can be a meaningful statement.

That the exchange of ideas between friends is supposed to work a bit like the principle of communicating vessels is a pretty weird assumption, too. Also, if he thinks it's ok to admit that he straight up tries to manipulate friends in this way, imagine how he approaches non-friends.

Between this and him casually admitting that he keeps "culture war" topics alive on the substack because they get a ton of clicks, it's a safe bet that he can't be thinking too highly of his readership, although I suspect there is an esoteric/exoteric teachings divide that is mostly non-obvious from the online perspective.

[–] AllNewTypeFace@leminal.space 8 points 4 days ago* (last edited 4 days ago)

So, by that token, if hypothetically you think that the Nazis got a few things right (not the war, racism or genocide, of course, or even the degenerate art, but maybe, say, the smoking bans and well-paved roads and perhaps the odd Wagnerian opera), the way to convince people is to start ranting about blood and soil and the need to exterminate the Untermenschen and wait for the nice normie liberals to politely meet you part of the way?

[–] CinnasVerses@awful.systems 6 points 4 days ago (1 children)

In his early blog posts, Scott Alexander talked about how he was not leaping through higher education in a single bound (he went overseas for medical school, and failed to get a medical residency on his first try, ending up in a small Midwestern city). So I wonder why he is sure that in a world with fewer university degrees, he would have gotten as far as he did (medical schools in the USA used to limit admissions from people of his ethnicity).

Likewise with immigration restrictions: he knows that they often blocked Jews, many Europeans, and East Asians, not just brown people, right?

[–] Architeuthis@awful.systems 8 points 4 days ago* (last edited 4 days ago) (1 children)

In his early blog posts, Scott Alexander talked about how he was not leaping through higher education in a single bound

He starts his recent article on AI psychosis by mixing up psychosis with schizophrenia (he calls psychosis a biological disease), so that tracks.

Other than that, I think it's ok in principle to be ideologically opposed to something even if you and yours happened to benefit from it. Of course, it immediately becomes iffy if it's a mechanism for social mobility that you don't plan on replacing, since in that case you are basically advocating for pulling up the ladder behind you.

[–] fullsquare@awful.systems 7 points 4 days ago (1 children)

He starts his recent article on AI psychosis by mixing up psychosis with schizophrenia (he calls psychosis a biological disease), so that tracks.

wait, this man is a psychiatrist? or is that another scott

[–] CinnasVerses@awful.systems 4 points 4 days ago* (last edited 4 days ago) (1 children)

Yes, Scott Alexander is an unusual rationalist blogger who had a credentialed professional career as a psychiatrist. After Substack became his patron, he opened his own medical practice, but the website has said "not accepting new patients at this time" since 2022. So he seems to live off gifts from fellow travelers with a side hustle in psychiatry.

[–] fullsquare@awful.systems 7 points 4 days ago (3 children)

i'll risk a guess that running ritalin-dispenser-as-a-service type business catering to overly confident rationalists might get him a pretty penny

[–] bigfondue@lemmy.world 4 points 3 days ago

Reading his adderall article, I couldn't help but think that this guy is handing scripts to everyone in the Bay Area

[–] CinnasVerses@awful.systems 5 points 4 days ago (2 children)

That is very possible, although I would guess that was earlier in his career, given that he does not advertise as treating ADHD or similar. He has two small children, a writing job, and side projects like writing end-of-the-world stories for AI 2027. His practice has a name drawn from Lord of the Rings, like other things in the Thielsphere.

[–] dgerard@awful.systems 6 points 3 days ago (1 children)

His practice has a name drawn from Lord of the Rings like other things in the Thielsphere.

fuckin lol, I had not spotted this, what a tell

[–] swlabr@awful.systems 3 points 3 days ago

Fatty Lumpkin's Headshrinking and Sundry?

[–] fullsquare@awful.systems 4 points 4 days ago (2 children)

he had a blogpost about how amphetamine risks are overstated and it's actually fine for more people than are usually prescribed it https://slatestarcodex.com/2017/12/28/adderall-risks-much-more-than-you-wanted-to-know/

[–] dgerard@awful.systems 5 points 3 days ago* (last edited 3 days ago) (1 children)

this post was the starting pistol for rationalists taking as much adderall as they could get down their necks. Scott is not telling you that adderall will make you a financial genius and super effective, you understand. Except Kelsey Piper, who he literally says this about by name.

This is the post that made it a rationalist commonplace that adderall makes anyone a super effective financial genius.

it's what got TPOT losers "microdosing" street meth. Of course the same TPOT confused meth and MDMA.

Somehow, Scott still has a license.

[–] fullsquare@awful.systems 4 points 3 days ago

ye, who are we to doubt superpredictors like them

[–] CinnasVerses@awful.systems 2 points 3 days ago* (last edited 3 days ago) (1 children)

Making general statements about the risks and benefits of medication is different from prescribing them. The George K. Lerner, MD, who was FTX's resident pill-pusher seems to be based in San Francisco and wants potential patients to know that inter alia "Dr. Lerner specializes in the treatment of Attention Deficit Disorder (ADD/ADHD) in adults. He has extensive experience in treating adults who have been successful in their professional endeavors but have found attention deficit symptoms to be an impediment to achieving their full potential." (nudge nudge)

His website does not mention a connection with the hospital in Michigan which is the only one where I know Alexander worked. I would like to know more about possible connections other than their mutual connections to the FTX gang. I have not done shoe-leather reporting in SoCal and almost all of the things we know about Alexander are things he posted voluntarily under his main handle.

Lerner's site shows what Alexander's site might look like if he were focused on psychiatry rather than writing and peddling racist lies.

[–] fullsquare@awful.systems 4 points 3 days ago

scott also has an explainer article on stimulants for ADHD where he writes:

[...] This matches my experience. I’ve worked with a few hundred Adderall patients

so maybe he doesn't have to advertise a lot, or at all https://lorienpsych.com/2020/10/30/adderall/

[–] dgerard@awful.systems 4 points 4 days ago

but look, i liked this article,