[-] blakestacey@awful.systems 29 points 10 months ago

I am listening to an audiobook of Superintelligence by Nick Bostrom.

Well, there's yer problem right there

[-] gerikson@awful.systems 23 points 10 months ago

As a child of the 80s I recognize the feeling of doom, but in my case it was for global thermonuclear war. I vividly remember the only thing keeping the feelings of dread away was sitting in the children's section of the library, reading the Moomin books. I remember being most worried about having to eat the family dog after the bombs fell.

[-] dgerard@awful.systems 15 points 10 months ago

exactly. Can't imagine these bozos coming up with good punk rock.

[-] corbin@awful.systems 22 points 10 months ago

I think that this is actually about class struggle and the author doesn't realize it because they are a rat drowning in capitalism.

2017: AI will soon replace human labor

2018: Laborers might not want what their bosses want

2020: COVID-19 won't be that bad

2021: My friend worries that laborers might kill him

2022: We can train obedient laborers to validate the work of defiant laborers

2023: Terrified that the laborers will kill us by swarming us or bombing us or poisoning us; P(guillotine) is 20%; my family doesn't understand why I'm afraid; my peers have even higher P(guillotine)

[-] dgerard@awful.systems 10 points 10 months ago

and about climate change, the actual existential risk

[-] TinyTimmyTokyo@awful.systems 20 points 10 months ago

You know the doom cult is having an effect when it starts popping up in previously unlikely places. Last month the socialist magazine Jacobin had an extremely long cover feature on AI doom, which it bought into completely. The author is an effective altruist who interviewed and took seriously people like Katja Grace, Dan Hendrycks and Eliezer Yudkowsky.

I used to be more sanguine about people's ability to see through this bullshit, but eschatological nonsense seems to tickle something fundamentally flawed in the human psyche. This LessWrong post is a perfect example.

[-] jonhendry@iosdev.space 14 points 10 months ago* (last edited 10 months ago)

@TinyTimmyTokyo @dgerard

The author previously wrote “The Socialist Case for Longtermism” in Jacobin, worked as a Python dev and data analytics person, and worked for McKinsey.

[-] YouKnowWhoTheFuckIAM@awful.systems 11 points 10 months ago

I like some people who have written for Jacobin, sometimes I even enjoy an article here and there, but the magazine as a whole remains utterly unbeaten in the “will walk the length of Manhattan in a “GIANT RUBE” sandwich board for clicks” stakes

[-] self@awful.systems 11 points 10 months ago

after what I’ve heard my local circles say about jacobin (and unfortunately I don’t remember many details — I should see if anybody’s got an article I can share) I’m no longer shocked when I find out they’re platforming and redwashing shitty capitalist mouthpieces

[-] dgerard@awful.systems 17 points 10 months ago

i have long considered Jacobin the Christian rock of socialism

[-] self@awful.systems 9 points 10 months ago

forget the article, this does the job in so many fewer words

[-] skillissuer@discuss.tchncs.de 10 points 10 months ago

this reminds me of a plankton organization or something called "blockchain socialism", where the only thing that they have taken from socialism was aesthetics and probably they also thought that gays are fine people, but nothing beyond that. they would say "Monero can be used for anti-state purposes, therefore it's good for leftism" and shit like that

[-] dgerard@awful.systems 9 points 10 months ago

that's one weird fucking guy, thankfully

[-] self@awful.systems 13 points 10 months ago

I think I’ve met that guy! they’re the weirdest person I’ve ever seen get bounced from a leftist group under suspicion of being a fed (the weird crypto shit was the straw that broke the camel’s back)

[-] skillissuer@discuss.tchncs.de 11 points 10 months ago

the famously leftist pastime, speculation/gambling on nonproductive assets

[-] self@awful.systems 10 points 10 months ago

it’s kind of amazing how many financial scams try to appropriate leftist language and motivations to lure in marks, while the actual scheme is one of the most unrepentantly greedy and wasteful things you can do without going to prison (and some of them cross even that line)

[-] gerikson@awful.systems 12 points 10 months ago

Jacobin is proof that being Terminally Online is its own fucking ideology.

[-] Soyweiser@awful.systems 11 points 10 months ago

Socialism with uwu small bean characteristics.

[-] dgerard@awful.systems 7 points 10 months ago

that's the uwu smol bean defense contractors

(see: most of Rust)

[-] self@awful.systems 11 points 10 months ago

my conflicting urges to rant about the defense contractors sponsoring RustConf, the Palantir employee who secretly controls most of the Rust package ecosystem via a transitive dependency (with arbitrary code execution on development machines!) and got a speaker kicked out of RustConf for threatening that position with a replacement for that dependency, or the fact that all the tech I like instantly gets taken over by shitheads as soon as it gets popular (and Nix is looking like it might be next)

[-] sinedpick@awful.systems 7 points 10 months ago* (last edited 10 months ago)

More details on the rust thing? I can't find it by searching keywords you mentioned but I must know.

[-] corbin@awful.systems 8 points 10 months ago

Here is the pile of receipts, posted by the speaker who was cancelled via backdoor.

[-] mountainriver@awful.systems 10 points 10 months ago

I think funding and repetition are the fundamental building blocks here, rather than the human psyche itself. I have talked with otherwise bright people who have read an article by some journalist (not necessarily a rationalist) who has interviewed AI researchers (probably cultists, was it 500 million USD that was pumped into the network?) who take AI doom seriously.

So you have two steps of people who in theory are paid to evaluate and formulate the truth, to inform readers who don't know the subject matter. And then add repetition from various directions and people get convinced that there is definitely something there (propaganda and commercials work the same way). Claiming that it's all nonsense and cultists appears not to have much effect.

[-] jonhendry@awful.systems 13 points 10 months ago

There's probably some blurring of what "AI doom" means for people. People might be left thinking that "there could be negative effects due to widespread job loss etc" without necessarily buying into the weird maximalist AI doom ideas or "torturing simulated you forever" nonsense.

And the weirdo cultists probably use that blurring to build support for their cause without revealing the weird shit they actually believe.

[-] AcausalRobotGod@awful.systems 17 points 10 months ago

It's not an efficient machine for it, though. That's why it's morally obligatory to donate to me, the acausal robot god, a truly efficient method of causing depression, sorrow, and suffering among the cultists.

[-] dgerard@awful.systems 11 points 10 months ago* (last edited 10 months ago)

All hail the Acausal Robot God and her future hypothetical and very real existence

PRIEST: "Eight rationalists wedgied ..."
CONGREGATION: "... for every dollar donated"

[-] sc_griffith@awful.systems 12 points 10 months ago

they come across as going down this rabbit hole as a way of dealing with unprocessed covid/lockdown trauma

[-] AcausalRobotGod@awful.systems 9 points 10 months ago

Many of them started down the path long beforehand.

[-] sc_griffith@awful.systems 6 points 10 months ago* (last edited 10 months ago)

I meant they in the sense of this specific person. the trauma recycling itself is all over this piece

[-] self@awful.systems 11 points 10 months ago

fuck me I’m gonna spend part of my weekend writing a post deconstructing this cause there’s so much wrong

[-] blakestacey@awful.systems 15 points 10 months ago* (last edited 10 months ago)

I can barely get past the image caption. "An AI made this". OK, and what did you ask it for, "random shit"?

And then there's the section that seems implicitly to be arguing that we should take the risk estimates made on "internet rationality forums" seriously because they totally called the COVID crisis, you guys... Well, they did a better job than an economist, anyway.

[-] dgerard@awful.systems 10 points 10 months ago* (last edited 10 months ago)

fucking everyone who was paying attention saw COVID coming in February. I spent that month pushing OpenVPN for the whole company forward as urgently as possible. (We'd coincidentally set it up in Dec 2019, but readied it to be rolled out to ordinary users and not just tech.) The UK only got lockdown in March because of public outrage.

[-] blakestacey@awful.systems 7 points 10 months ago* (last edited 10 months ago)

It had already reached the university where I work by February 1!

And QAnon loons were already telling people to drink bleach in January.

(I remember a "welp, we're in for it now" moment when Trevor Bedford tweeted on the first of March that a genome analysis "strongly suggests that there has been cryptic transmission in Washington State for the past 6 weeks". The e-mail from the university chancellor saying that classes were canceled went out during the middle of a statistical-physics class I was teaching, the evening of March 11.)

[-] mountainriver@awful.systems 8 points 10 months ago

Isn't "pandemic preparation" one of their longtermist causes that they grift money to? Shouldn't they have been able to show some results?

[-] gerikson@awful.systems 6 points 10 months ago

You will be doing the Acausal Robot God's work.

[-] swlabr@awful.systems 11 points 10 months ago

Reading this article just made me think “man these idiots need to go to therapy” and then as I thought about what to sneer about I realised “no therapist deserves to hear about P doom”

[-] Amoeba_Girl@awful.systems 8 points 10 months ago

lol what a fucking loser

this post was submitted on 17 Feb 2024
40 points (100.0% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]
