[-] saucerwizard@awful.systems 1 points 1 year ago
[-] TinyTimmyTokyo@awful.systems 1 points 1 year ago

That reminds me. If the world is about to FOOM into a kill-all-humans doomscape, why is he wasting time worrying about seed oils?

A lot of rationalism is just an intense fear of death. Simulation hypothesis? Means that maybe you can live forever if you're lucky. Superintelligence? Means that your robot god might grant you immortality someday. Cryonics? Means that there's some microscopic chance that even if you pass away you could be revived in the future at some point. Longtermism? Nothing besides maybe someday possibly making me immortal could possibly matter.

I mean don't get me wrong I'd give a lot for immortality, but I try to uhh... stay grounded in reality.

[-] davidemmerson@mastodon.me.uk 0 points 1 year ago

@sailor_sega_saturn @cstross I'm not sure that those things are "Rationalism". Rationalism is about reason above religious belief, not substituting one god with another.

[-] self@awful.systems 1 points 1 year ago

don’t worry, folk like Yudkowsky have already taken care of substituting one god with another under rationalism’s name in an attempt to grift atheists who miss the comfort of an afterlife to look forward to

you can thank him by donating a tithe of all your money to effective altruism or whichever “AI alignment” org will save you from the Basilisk these days

this post was submitted on 21 Aug 2023
