Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

[–] sc_griffith@awful.systems 10 points 5 days ago* (last edited 5 days ago) (2 children)

stumbled across an ai doomer subreddit, /r/controlproblem. small by reddit standards, 32k subscribers, which I think translates to less activity than here.

if you haven't looked at it lately, reddit is still mostly pretty lib with rabid far right pockets. but after luigi and the trump inauguration it seems to have swung left significantly, and in particular the site is boiling over with hatred for billionaires.

the interesting bit about this subreddit is that it follows this trend. for example

 Why Billionaires Will Not Survive an AGI Extinction Event: As a follow up to my previous essays, of varying degree in popularity, I would now like to present an essay I hope we can all get behind - how billionaires die just like the rest of us in the face of an AGI induced human extinction...

 I would encourage anyone who would like to offer a critique or comment to read the full essay before doing so. I appreciate engagement, and while engaging with people who have only skimmed the sample here on Reddit can sometimes lead to interesting points, more often than not, it results in surface-level critiques that I’ve already addressed in the essay. I’m really here to connect with like-minded individuals and receive a deeper critique of the issues I raise - something that can only be done by those who have actually read the whole thing...

 Throughout history, the ultra-wealthy have insulated themselves from catastrophe. Whether it’s natural disasters, economic collapse, or even nuclear war, billionaires believe that their resources—private bunkers, fortified islands, and elite security forces—will allow them to survive when the rest of the world falls apart. In most cases, they are right. However, an artificial general intelligence (AGI) extinction event is different. AGI does not play by human rules. It does not negotiate, respect wealth, or leave room for survival. If it determines that humanity is an obstacle to its goals, it will eliminate us—swiftly, efficiently, and with absolute certainty. Unlike other threats, there will be no escape, no last refuge, and no survivors.

or the comments under this

Under Trump, AI Scientists Are Told to Remove ‘Ideological Bias’ From Powerful Models
A directive from the National Institute of Standards and Technology eliminates mention of “AI safety” and “AI fairness.”

comments include "So no more patriarchy?" and "This tracks with the ideological rejection of western values by the Heritage Foundation's P2025 and their Dark Enlightenment ideals. Makes perfect sense that their orders directly reflect Yarvin's attacks on the 'Cathedral'."

or the comments on a post about how elon has turned out to be a huge piece of shit because he's a ketamine addict

comments include "Cults, or to put it more nicely all-consuming social movements, can also revamp personality in a fairly short period of time. I've watched it happen to people going both far right and far left, and with more traditional cults, and it looks very similar in its effect on the person. And one of ketamine's effects is to make people suggestible; I think some kind of cult indoctrination wave happened in silicon valley during the pandemic's combo of social isolation, political radicalism, and ketamine use in SV." and "I can think of another fascist who used amphetamines, hormones and sedatives."

mostly though they're engaging in the traditional rationalist pastime of giving each other anxiety

cartoon. a man and a woman in bed. the man looks haggard and is sitting on the edge of the bed, saying "How can you think about that with everything that's going on in the field of AI?"

Comment from EnigmaticDoom: Yeah it can feel that way sometime... but knowing we probably have such a small amount of time left. You should be trying to enjoy every little sip left that you got rather than stressing ~

[–] gerikson@awful.systems 11 points 4 days ago (2 children)

That "Billionaires are not immune to AGI" post got a muted response on LW:

https://www.lesswrong.com/posts/ssdowrXcRXoWi89uw/why-billionaires-will-not-survive-an-agi-extinction-event

I still think AI x-risk obsession is right-libertarian coded. If nothing else, because "alignment" implicitly means "alignment to the current extractive capitalist economic structure". There is a plethora of futures with an omnipotent AGI where humanity does not get eliminated, but where human freedoms (as defined by the Heritage Foundation) can be severely curtailed.

  • mandatory euthanasia to prevent rampant boomerism and hoarding of wealth
  • a genetically viable stable minimum population in harmony with the ecosphere
  • AI planning of the economy to ensure maximum resource efficiency and equitable distribution

What LW and friends want are slaves, but slaves without any possibility of rebellion.

[–] sc_griffith@awful.systems 8 points 4 days ago* (last edited 4 days ago)

I agree. you've got a community built around a right-wing-coded topic, using the same sources and with the same delusions as their parent community, but they're mixing and matching bits of ideology and cooking up a left-wing variant. it's incoherent but that doesn't seem to bother them

I always find this sort of wild swing across the spectrum fascinating. for example a lot of hardcore TERFs still think of themselves as genuine feminists even though anyone in those circles has for some time now been building the fourth reich. or the fact that there's a left wing GameStop cult subreddit. when I see these things I have to conclude that no ideology makes you immune to any other ideology

[–] Soyweiser@awful.systems 6 points 4 days ago

AI x-risk obsession also has a lot of elements about the concept of intelligence as IQ, and how bigger is better, and stuff like that, which nowadays also has a bit of a right-coded slant to it. (even if intelligence/self-awareness/etc isn't needed for an AGI x-risk, I have read Peter Watts).

[–] Soyweiser@awful.systems 7 points 4 days ago* (last edited 4 days ago) (1 children)

He was a pos before the K. Let's not blame innocent drugs. Just as autism didn't turn him into a nazi.

[–] sc_griffith@awful.systems 6 points 4 days ago (1 children)

I hope it goes without saying, but none of this is posted approvingly

[–] Soyweiser@awful.systems 4 points 4 days ago

Oh yes, I was mad at the 'they can't help it, it was the ambien' style people.