[-] swlabr@awful.systems 13 points 1 year ago* (last edited 1 year ago)

Tangent to your point- what would happen if we started misusing tescreal terms to dilute their meaning? Some ideas:

“I don’t want to go to that party. It’s an x-risk.”

“No, I didn’t really like those sequel films. They were inscrutable Matrices.”

“You know, holding down the A button and never letting up is a viable strategy as long as you know how to brake and mini-turbo in Mario Kart. Look up ‘effective accelerationism’.”

Anyway I doubt it would do anything other than give us a headache from observing/using rat terms. Just wanted to have a lil fun.

[-] Amoeba_Girl@awful.systems 11 points 1 year ago

i'll definitely start using "existential risk" for any minor inconvenience, thank you

[-] self@awful.systems 10 points 1 year ago

there’s significant x-risk in my need to clean my espresso machine conflicting with my extreme laziness preventing me from doing so

[-] Soyweiser@awful.systems 5 points 1 year ago* (last edited 1 year ago)

If you think you are the only real human alive, all risks are existential. If you die they shut down the simulation. This is why Musk will never fly in one of his own rockets. And my bytes thank him for it.

[-] log@mastodon.sdf.org 6 points 1 year ago

@Soyweiser @Amoeba_Girl Any sim-solipsist worth their processing time would know that even if one instance dies, certain calculations might be memoized and reused in other instances. If the rocket blows up, they can just reuse that sequence on another Musk sim whose rocket blows up, too. If you're important enough to be the sole protagonist, why not be important enough to have a billion instances of yourself running concurrently in variant simulations?
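In the spirit of the joke: memoization just means caching a pure function's result so that identical calls are served from the cache instead of being recomputed. A minimal Python sketch (the function name and "simulation" are, of course, made up):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def simulate_rocket_launch(musk_instance: int) -> str:
    # Hypothetical expensive computation; after the first call with a given
    # argument, the cached result is reused by any later identical call.
    return f"instance {musk_instance}: boom"

simulate_rocket_launch(1)  # computed once
simulate_rocket_launch(1)  # served from the cache, no recomputation
```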

[-] Soyweiser@awful.systems 4 points 1 year ago

But are those copies really you? They are copies, after all, and eventually your instance might hit a dead branch in which all actions lead to death, while the only surviving branch of you might be so divergent in the choices it made that it can no longer be considered you. That is simply not a risk I can take.

"This message was send from my padded cell"

[-] log@mastodon.sdf.org 5 points 1 year ago

@Soyweiser As long as they are enough like me to still be better than everyone else, they pass the narcissism filter. All those billions will eventually have to fail somehow anyway, to determine the grand champion best possible me that will be copied the most for the next round.

By my calculations, the red light prolonged my commute by 3 minutes, thus costing approximately 54 billion lives.

this post was submitted on 26 Oct 2023
17 points (100.0% liked)

SneerClub

983 readers

Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

founded 1 year ago