Art by smbc-comics

Consciousness is often said to disappear in deep, dreamless sleep. We argue that this assumption is oversimplified. Unless dreamless sleep is defined as unconscious from the outset, there are good empirical and theoretical reasons for saying that a range of different types of sleep experience, some of which are distinct from dreaming, can occur in all stages of sleep.

PubMed article: Does Consciousness Disappear in Dreamless Sleep?

ScienceAlert article: We Were Wrong About Consciousness Disappearing in Dreamless Sleep, Say Scientists

[-] ArbitraryValue@sh.itjust.works 16 points 1 year ago* (last edited 1 year ago)

A human being is a process of computation. Ending the computation is death. Pausing the computation is, well, simply pausing the computation. It has no profound significance.

(This is also my answer to the "teleporter problem." As long as the computation continues, a change in the substrate on which it takes place also has no profound significance.)
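To make the "pausing the computation" intuition concrete, here is a minimal sketch (the toy step function and numbers are invented, not anything from the comment): a deterministic process is checkpointed mid-run, the original is discarded, and the run resumes from the snapshot on a "new substrate"; the continuation is indistinguishable from never having paused.

```python
import pickle

def step(state):
    # One deterministic update of the toy "computation" (a stand-in for a mind).
    return (state * 1103515245 + 12345) % (2 ** 31)

def run(state, steps):
    for _ in range(steps):
        state = step(state)
    return state

# An uninterrupted run of the computation.
uninterrupted = run(42, 1000)

# The same computation, paused halfway: the state is serialized, the original
# object is discarded, and the run resumes from the restored snapshot.
paused = run(42, 500)
snapshot = pickle.dumps(paused)             # the information that defines the process
del paused                                  # the original "substrate" is gone
resumed = run(pickle.loads(snapshot), 500)

assert uninterrupted == resumed             # the continuation is identical either way
```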

[-] query@lemmy.world 15 points 1 year ago

And if the teleportation process doesn't terminate the original, but creates a copy on the other end, are they both the same person?

[-] ArbitraryValue@sh.itjust.works 5 points 1 year ago* (last edited 1 year ago)

Creating and destroying perfectly identical copies of the information that corresponds to a person neither creates nor destroys people unless the very last copy of that information is destroyed, in which case the person is killed.
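One way to picture that "last copy" rule is deduplicated storage. A minimal sketch, assuming a made-up content-addressed store (an illustration, not the commenter's proposal): bit-identical copies share one stored blob, and the blob only vanishes when the final copy is deleted.

```python
import hashlib

store = {}   # content hash -> (data, number of copies referencing it)

def add_copy(data: bytes) -> str:
    key = hashlib.sha256(data).hexdigest()
    blob, refs = store.get(key, (data, 0))
    store[key] = (blob, refs + 1)       # another identical copy: nothing new is stored
    return key

def delete_copy(key: str) -> None:
    blob, refs = store[key]
    if refs == 1:
        del store[key]                  # the very last copy: the information is lost
    else:
        store[key] = (blob, refs - 1)   # one of several copies: nothing is lost

person = b"a perfectly identical snapshot of a person"
k1 = add_copy(person)
k2 = add_copy(person)                   # the "teleporter" makes a second copy

delete_copy(k1)
print(k2 in store)                      # True: destroying one copy destroyed nothing
delete_copy(k2)
print(k2 in store)                      # False: destroying the last copy did
```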

Small divergences aren't a big deal. For example, if a person spends an hour under the effect of an anesthetic (or alcohol) which prevents the formation of new long-term memories, this person isn't dying when he goes to sleep and wakes up without any memories of that last hour.

Larger divergences are a big deal - losing a year of memories is pretty bad, losing a decade is even worse, and having one's mind returned to the blank slate of an infant is very close to the same thing as dying.

So what I'm saying is that the two copies start out as the same person and then gradually become different people.

[-] doofy77@aussie.zone 3 points 1 year ago

You've been watching Farscape.

[-] Zorque@kbin.social 1 points 1 year ago

I doubled her... twinned her!

Definitely a fucky-brain episode.

[-] Anduin1357@lemmy.world 2 points 1 year ago

I would argue that two disconnected copies of the information that corresponds to a person do make two disjoint persons.

Like running procedural generation with a different seed, entropy will ensure that these two identical persons won't stay identical after a few ticks of the biological clock.
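Roughly that claim as a toy sketch (the tick function and seeds are invented): two states that are identical at the moment of copying stop being identical as soon as their inputs differ.

```python
import random

def tick(state, rng):
    # One "biological clock" tick: the same state plus a little noise.
    return state + rng.random()

copy_a = copy_b = 0.0                               # perfectly identical at duplication
rng_a, rng_b = random.Random(1), random.Random(2)   # but fed different entropy afterwards

for _ in range(10):
    copy_a = tick(copy_a, rng_a)
    copy_b = tick(copy_b, rng_b)

print(copy_a == copy_b)   # False: one person at t=0, two different people ten ticks later
```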

[-] ArbitraryValue@sh.itjust.works 1 points 1 year ago* (last edited 1 year ago)

I agree that the copies will diverge almost instantly; I'm just saying that small amounts of divergence aren't a big deal. That's what I'm trying to illustrate with my example of the person who loses an hour of memories. I think this is exactly equivalent to making a copy, having that copy exist for an hour, and then destroying it. An hour of memories does make the copy different from the original, but the loss of the copy is just the loss of that hour, not of a complete human being (and we naturally quickly forget much more than that - I already can't remember what I did every hour yesterday).

I admit I don't feel like it's exactly equivalent, but I think that's an illusion caused by my moral intuitions developing in a world where destroying a copy always means destroying the only copy.

[-] Anduin1357@lemmy.world 1 points 1 year ago

Though the simpler explanation is that memory formation was just paused over that period, rather than the person having formed memories and then 'lost' them to sleep.

Losing memories when you're wide awake is like a file system deleting pointers to a file. The file is still there, just inaccessible.
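A toy version of that filesystem analogy (the path and block number are made up): deleting the pointer only makes the data unreachable, it doesn't erase it.

```python
blocks = {0: b"what happened in that lost hour"}   # the data itself
directory = {"/memories/hour_23": 0}               # pointer: name -> block number

del directory["/memories/hour_23"]                 # "forgetting": only the pointer goes

print("/memories/hour_23" in directory)            # False: it can no longer be looked up
print(blocks[0])                                   # ...but the underlying data is still there
```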

Anyways, I feel that the assertion that "creating and destroying perfectly identical copies of the information that corresponds to a person neither creates nor destroys people" is extremely dangerous thinking that could lead to the premature end of consciousness for some very unfortunate individuals. After all, the copies may be perfectly identical, but we have no documented instance of anyone sharing a consciousness, so it may be that each consciousness is unique and not interchangeable.

[-] jarfil@lemmy.world 2 points 1 year ago

If it creates a copy, then it isn't teleportation, it's copying. Two copies will diverge from the moment they're no longer a single copy.

[-] ech@lemm.ee 6 points 1 year ago

a change in the substrate on which it takes place also has no profound significance.

It does to the person being "deleted".

[-] Carnelian@lemmy.world 5 points 1 year ago

Have you played SOMA? Fun game, gets into this exact type of thing

[-] massive_bereavement@kbin.social 5 points 1 year ago

SOMA is one of my favorite gaming experiences and probably one of the best sci-fi stories in this medium.

Sadly, some of the monster bits were a bit weaker, and I think Amnesia fans felt it didn't match their expectations.

[-] NegativeInf@lemmy.world 4 points 1 year ago

It feels like dreaming is the "training from a batch of sample memories" tactic from deep learning.
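For anyone who hasn't run into it, the deep-learning tactic being gestured at is usually called experience replay: keep a buffer of past experiences and train on random mini-batches of it offline. A minimal sketch (the buffer contents and update_weights are placeholders):

```python
import random

replay_buffer = []                       # everything experienced while "awake"

def live_one_day(day):
    # Waking life just appends raw experiences to the buffer.
    replay_buffer.extend((day, event) for event in range(100))

def update_weights(batch):
    pass                                 # stand-in for whatever learning actually happens

def dream(batch_size=32):
    # "Dreaming": replay a jumbled random sample of old memories and learn from it.
    batch = random.sample(replay_buffer, batch_size)
    update_weights(batch)

for day in range(7):
    live_one_day(day)
    dream()
```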

this post was submitted on 04 Sep 2023
863 points (96.1% liked)

Showerthoughts

A "Showerthought" is a simple term used to describe the thoughts that pop into your head while you're doing everyday things like taking a shower, driving, or just daydreaming. A showerthought should offer a unique perspective on an ordinary part of life.

Rules

  1. All posts must be showerthoughts
  2. The entire showerthought must be in the title
  3. Avoid politics
    • 3.1) NEW RULE as of 5 Nov 2024, trying it out
    • 3.2) Political posts often end up being circle jerks (not offering a unique perspective) or inflammatory (too much work for mods).
    • 3.3) Try c/politicaldiscussion, volunteer as a mod here, or start your own community.
  4. Posts must be original/unique
  5. Adhere to Lemmy's Code of Conduct
