Yudkowsky advises his fellow Effective Altruists to take the FTX money and run. For the sake of charity, you understand.
(forum.effectivealtruism.org)
how have awful nerd writers like Yud not realized that D&D alignments are barely serviceable as a storytelling mechanism, much less an ethical framework? I keep seeing the alignment chart seriously invoked as if it were an irrefutable aspect of human nature, but it was written as a gameplay mechanic (for spells/prayers that care whether the caster or target is good or evil) and falls apart under the lightest scrutiny, as plenty of DMs and D&D writers have noticed. why is this still a thought-terminating cliché in nerd culture circles?
Spoken like a true Lawful Good weenie.
As a Chaotic Neutral INTJ Gray Tribe Ravenclaw Scorpio the DnD alignment system works great for analyzing behavior in hunter-gatherer societies and therefore ours.
You didn’t even declare your ruleset version…? G’damn casuals 🙄
(/s, of course)