
Some light sneerclub content in these dark times.

Eliezer compliments Musk on the creation of Community Notes, a project which actually predates Musk's takeover of Twitter by a couple of years (see the join date: https://twitter.com/CommunityNotes ).

In reaction, Musk admits he has never read HPMOR and suggests a watered-down Turing test involving it.

Eliezer responds by inventing HPMOR wireheads.


I'm reminded of a My Little Pony singularity fanfiction (Friendship is Optimal) that I read back when I had poor taste. An AI for a pony MMORPG goes rogue and converts everyone into digital ponies to maximize happiness, but with a pony theme. The victims live out impossibly long, but ultimately superficial, lives doing pony stuff and goodness gracious why is there such a weird relationship between rationalists and fanfiction writers.

[-] swlabr@awful.systems 11 points 1 year ago

most charitable psychoanalysis: projecting their sense of rationality onto a fictional world is a way to express a deep longing for rules and logic in an often cruelly irrational world

least charitable: their sense of rationality can only be true in a fictional world, so they want to live in that rather than reality

Neutral charity: the author is dead, all interpretation is essentially fanfiction, and since we are all individuals, all relationships with texts/fanfiction are weird.

[-] dgerard@awful.systems 10 points 1 year ago

doing pony stuff

the most euphemistic description yet of the cursed slab of ponyfucking

[-] blakestacey@awful.systems 4 points 1 year ago

"I dig a pony ... Well, you can penetrate any place you go / Yes, you can penetrate any place you go / I told you so"

[-] corbin@awful.systems 8 points 1 year ago

It's the combination of big imaginations and little real-world experience. In Friendship is Optimal, the AGI goes from asking for more CPUs to asking for information on how to manufacture its own CPUs, somehow without involving the acquisition of silicon crystals or ASML hardware along the way. Rationalist writers imagine that AGI will somehow provide its own bounty of input resources, rather than participating in the existing resource economy.

In reality, none of our robots have demonstrated the sheer instrumentality required to even approach this sort of doomsday scenario. I think rationalists have a bit of the capitalist blind spot here, imagining that everything and everybody (and everypony!) is a resource.

[-] sue_me_please@awful.systems 7 points 1 year ago

Whatever, I'll be a pony. Where do I sign up?

[-] swlabr@awful.systems 10 points 1 year ago

Pleasure Island, from Pinocchio. You gotta ask for the pony pass though, or else you’re just gonna get turned into a donkey. To reverse the transformation you gotta go to the island of Dr. Moreau.

[-] maol@awful.systems 6 points 1 year ago

they're both extremely online. next question
