• @AgreeableLandscape
    link
    4
    edit-2
    3 years ago

    The biggest problem is that “backing up” or “uploading” a brain to a computer is not “transferring” it. If you back up your brain and then your own brain dies, you won’t be in the computer, you’ll just be dead. A replica that thinks like you will be in the computer.

    • @DrivingForce
      link
      3
      3 years ago

      Sometimes I think society conceptualizes individuality and consciousness wrong. We are being recreated by quantum mechanics all the time. With that in mind, why do we consider ourselves one continuous person? What is so different about uploading to a computer? Is the you in the computer not the same you as the one in the flesh?

      • @AgreeableLandscape
        link
        2
        3 years ago

        Thing is, it’s impossible to know whether your consciousness will remain after such a process, or whether you will die and a new consciousness that thinks like you will be created, until you try it. If you continue existing, then great, but if you are replaced by a replica, anyone observing you has no way of knowing, because the replica will think that it existed prior to its creation.

        • @DrivingForce
          link
          1
          3 years ago

          If it has full memories of existing before its body was created, then is it you? I would think so. A perfect copy of me is just as much me as I am. A digital upload is just a less-than-perfect copy of a flesh substrate.

          • @AgreeableLandscape
            link
            2
            edit-2
            3 years ago

            My point is that you won’t know for sure until the process is done and you either wake up in the computer or never experience anything again (or end up wherever consciousness goes after death; you can’t be sure of that either until you actually die). That’s what’s terrifying about it.

            Also, an upload creates a copy, it does not cleanly transfer something, so which copy is the real you? If flesh and blood you and computer you exist simultaneously and experience different things, the two minds will diverge. Then which is the real you?

            • @DrivingForce
              link
              2
              3 years ago

              Both copies are the real me in the same way a cloned git repo is the same project. If they diverge but still consider themselves the same project, they are in fact the same project. They would just have some differences on top of a base sameness.

              That is how I think of it anyway.

              • @abbenm
                link
                2
                3 years ago

                Exactly. There’s a sense in which they are structurally the same, a sense in which you can say their identity is the same or very similar, but they are different instances. And in terms of instance they are easy to distinguish, and that is sufficient to dispense with many philosophical riddles that depend on asking which copy is “real”.
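
                One rough way to picture that instance-vs-identity split is in code. Here’s a toy Python sketch of my own (the names and the dict-as-a-mind stand-in are made up purely for illustration):

                import copy

                # A stand-in for a "mind": just some state that can be copied.
                original = {"name": "me", "memories": ["first day of school", "cloned a git repo"]}

                # A perfect copy: identical content, but a separate object.
                replica = copy.deepcopy(original)

                print(original == replica)  # True  -> structurally the same
                print(original is replica)  # False -> two distinct instances

                # From here on, the instances are free to diverge.
                replica["memories"].append("woke up in a computer")
                print(original == replica)  # False -> no longer even content-equal

                The == check is the “same project” sense of sameness from the git analogy, and the is check is the “which instance am I actually inside” sense, which is the one these riddles tend to blur together.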

            • @abbenm
              link
              2
              edit-2
              3 years ago

              I think the great thing you’ve identified is that, even if you really did go dark, you wouldn’t be able to say so, and the copy would say everything was fine, because they inherited all your memories. Unless, say, we could do something fancy with the reconstructed copy to purposely change it in a deliberate way, like put a watermark over their vision saying “You are the copy!”

              But, as I said before, in practical terms, I think you really can know if you were transferred in many cases, given what we do know.

              For instance, there’s the common example of a hypothetical teleporter that disintegrates you and rebuilds a copy of you somewhere else. In that case, you go dark, and it’s your copy that’s somewhere else.

              For a more vulgar example, if a perfect clone of me was made on the other side of the world, and I jumped into a volcano, it’s pretty clear that I wouldn’t experience a “transfer” of myself to that new body.

              And these aren’t frivolous examples, by the way. A lot of the talk of being “copied” contemplates variations on the theme of dispensing with the original and making a copy somewhere else, and the answer to whether “you” are still there comes from a mundane examination of the practical details of whatever the method is of copying.

              Presumably, if we got to the point of being able to copy people, we would know enough about how actual transference works that it (hopefully) wouldn’t be an open question, and our answer would depend on all kinds of boring structural details about how consciousness works.

              Anyway, long story short, I think we can chip away at a lot of the mystery with boringly practical answers in a way that steers clear of any encounter with a profound philosophical mystery. And while we shouldn’t be so cavalier about throwing away the Big Question (is it The Real You), I think a disposition for plain answers that focus on boring details probably helps steer clear of traps that people get stuck in.

              Also, an upload creates a copy, it does not cleanly transfer something, so which copy is the real you? If flesh and blood you and computer you exist simultaneously and experience different things, the two minds will diverge. Then which is the real you?

              I 100% agree. I think that’s exactly one of the traps that people get stuck in. If you exist side by side, your experiences won’t be exactly parallel. You can meaningfully answer by distinguishing between instances, at least, and noting the continuity of consciousness across time that one enjoys, while the other was recently poofed into existence. The fleshy one, that’s you. Done!

              But here’s what I don’t get about the path people take in these philosophical jeremiads. What’s with ending in paralyzed fear of a question? People seem to go there, or even want to go there, like it’s the goal of the whole conversation. I’ve seen this so many times. Ending with “who is the REAL you?” or “how do you decide?” Not that those are illegitimate questions, but… it’s like you’re trying to psych someone out by entrapping them in a Criss Angel Mindfreak, using the tools of philosophy to find a way into that particularly panicked headspace. That should not be the goal. I think those are merely pit stops on the way to a continued examination of the question, not endpoints, and I think some people treat them as endpoints because they value the experience of being in that headspace, which I don’t get.

          • @abbenm
            link
            2
            edit-2
            3 years ago

            If it has full memories of existing before its body was created, then is it you? I would think so.

            No, because it’s a different instance. From the copy’s perspective, it’s you and always was. From your perspective, you “go dark.” The copy thinks it’s “you”, as in the same, uninterrupted instance that you are. It just happens to be mistaken. It’s identical except for the fact that it’s a separate instance of you.

            A perfect copy of me is just as much me as I am.

            In a sense, I think that’s fair. It thinks it’s you, it feels like it always was you, and I think it’s fair to take some satisfaction that, if you could have that copy, at least to that copy it feels like it was always there.

            A digital upload is just a less than perfect copy of a flesh substrate.

            Wait, why? That may or may not be the case, but it would be for reasons that have to do with complicated science about the structural and chemical properties of those respective systems, and with investigating whether the differences between them make a difference that matters.

            I can write Morse code on a sheet of paper or etch it into a stone tablet, and there’s a meaningful sense in which the code is still the same.
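
            To make that concrete, here’s a toy Python sketch of my own (the tiny Morse table and the names are made up purely for illustration):

            # Deliberately tiny Morse table, just enough for this example.
            MORSE = {"S": "...", "O": "---"}

            def encode(message):
                """Translate a message into Morse code, independent of any medium."""
                return " ".join(MORSE[letter] for letter in message)

            signal = encode("SOS")

            # "Write" the identical code onto two very different substrates.
            on_paper = {"medium": "paper", "code": signal}
            on_stone = {"medium": "stone tablet", "code": signal}

            print(on_paper["code"] == on_stone["code"])  # True: the media differ, the code doesn't

            The media have wildly different physical properties, but the encoded message is the same string either way.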

            • @DrivingForce
              link
              1
              3 years ago

              Wait, why? That may or may not be the case, but it would be for reasons that have to do with complicated science about the structural and chemical properties of those respective systems, and with investigating whether the differences between them make a difference that matters.

              I think it is reasonable that the chemicals, bacteria, and viruses in our body have an effect on who we are and how we act. A computer (excluding super-advanced ones like those in the “modern”-era Orion’s Arm universe) would have a different effect on you than the things I mentioned previously. While your code is the same on the two media, one program compiled separately for two separate architectures would not be. The two binaries are less-than-perfect copies with ever-so-slight differences in how they behave.

              On another note: a perfect copy of me would act on my behalf, and I on the copy’s. I don’t think there would be any meaningful distinction between the two of me unless I was separated from the other instance for a long time and we had wildly different environments. If one body dies, I am still alive because of the other me. The above would apply to less-than-perfect copies too, up until a threshold where I and the copy were no longer acting on each other’s behalf.

              • @abbenm
                link
                2
                edit-2
                3 years ago

                I think it is reasonable that the chemicals, bacteria, and viruses in our body have an effect on who we are and how we act.

                I understand that, but as I said, that would be for reasons that have to do with complicated science about the structural and chemical properties of those respective systems, and with investigating whether the differences between them make a difference that matters. Do clay tablets introduce copying errors you don’t have when you write down Morse code with pen and paper? Do they need some assembly-level code at the beginning because they get fed into a stone-tablet-reading machine? Does the totality of those changes alter the meaning or essence of the message as it existed on pen and paper?

                A computer is different from biology, of course, but it can in principle model the effects of biology, if we decide that we need those to fully represent a mind that we want to call equivalent to the “real” thing. So we wouldn’t just simulate a brain, but a brain as it would exist in a body with its environmental inputs. Unless a computer can’t model those? Which I would probably disagree with, but there is a fair argument to be had there. But the point is, I don’t think complexity of interacting environments gets you out of the space of things that can in principle be represented on a computer.

                A perfect copy of me would act on my behalf, and I on the copy’s. I don’t think there would be any meaningful distinction between the two of me unless I was separated from the other instance for a long time and we had wildly different environments.

                It depends on what distinction you are trying to make, and what you decide is meant by “me.” Just at the level of plain language, they are clearly different from you in that they are a different instance of you experiencing an independent stream of consciousness. You wouldn’t necessarily coordinate, you would see different things at different times. You would be “inside” one of them and not the other, etc.

        • @abbenm
          link
          1
          3 years ago

          Thing is, it’s impossible to know whether your consciousness will remain after such a process, or whether you will die and a new consciousness that thinks like you will be created, until you try it.

          Wait, what? I think there are some cases where it’s pretty obvious your consciousness wouldn’t be transferred. If your brain is (say) scanned and recreated digitally, and your original brain is destroyed, it stands to reason there wasn’t a transference.

          I think you may be right that there’s an open question if you were to do a gradual, bit-by-bit transference, maybe. But as a matter of principle, we should hope that we know enough about how consciousness works to say something meaningful about it, and not just declare that by “its nature” or by “definition” it can never be known but from the inside.

          If you continue existing, then great, but if you are replaced by a replica, anyone observing you has no way of knowing because the replica will think that it existed prior to its creation.

          That’s a question of who knows what, which should be distinguished from the underlying question of whether or not a genuine transference actually did happen.

    • ☆ Yσɠƚԋσʂ ☆
      link
      3
      3 years ago

      Whether that’s the case or not really depends on the process. For example, imagine if you could replace individual neurons with artificial ones while you were still conscious. You wouldn’t notice losing a single neuron, and if you replaced all the neurons over time, then by the end of the process you will have transferred your consciousness to a new substrate. Obviously, that isn’t a practical approach. However, it shows that this is possible to do in principle.

      A more realistic option would be to integrate an artificial system into the corpus callosum. Our brain is already split into two independent hemispheres that communicate over this channel. So, you could have the new hemisphere integrate with it, and map out one of the original hemispheres to the point where it’s able to mirror it. Then you could swap out each part of the brain in turn with an artificial version.

      • @abbenm
        link
        3
        3 years ago

        You can, and this is a great point. However, (1) I think most of the time these hypotheticals get discussed, this is never contemplated or clarified at all, and generally people are just talking about copies rather than the transfer of a single continuous identity.

        And (2), if we’re really cracking this open, we may have to confront the idea that this kind of transfer is already happening all the time biologically. And we may have to accept that there isn’t a single entity preserved across time. And if we really, really, really dig, it’s not just about physical substrate, it’s how much of yourself you lose to lost memories. And I think we have to take the passage of time seriously, and note that past instances of you are lost, in a sense. It’s like Derek Parfit’s quote about the glass tunnel, essentially.

        • ☆ Yσɠƚԋσʂ ☆
          link
          5
          3 years ago

          Right, we’re constantly changing over time and we’re not the same person we were in the past. We have the notion of continuity and persistent identity over time, but it’s just a mental construct.

          Another interesting aspect is that we’re technically hive minds since each hemisphere can support a human consciousness all on its own. This is a great article on the subject.

          • @abbenm
            link
            3
            edit-2
            3 years ago

            You know, on re-reading my comment I thought I was digressing too much and was going to lose people. But you actually picked up the ball and advanced it farther down the field. Well done! I am new to this idea that we are (or are possibly?) hiveminds already, so I’ll take a look at the article.

            Also I had a related thought: suppose, like @AgreeableLandscape@lemmy.ml, we have a concern about whether or not we were “really” copied. In some far-off future where We Have The Technology, perhaps before fully “transferring” a person, you could wake them up in a state where they are simultaneously present in their own body and in whatever medium they are being copied to. Then such a person could “approve” their complete transfer once satisfied that their stream of consciousness would be sustained, without disconnection, on whatever their new medium is. Although it would be super trippy to experience having two bodies, or two streams of thought (or whatever), at the same time during the intermediate phase.

            • ☆ Yσɠƚԋσʂ ☆
              link
              2
              3 years ago

              Right, I think the continuation of consciousness is key to knowing that you’re not a copy. If the process is done in such a way that you remain conscious throughout, then you know that you’re a continuation of the same process. Learning to Be Me is a fun short story exploring this idea as well.

              Also worth considering that a separate physical body isn’t the only way a mind could be extended. For example, artificial parts of the brain could integrate with virtual reality or the internet in general. So it doesn’t even need to be a physical copy, just your mind expanding into new domains that were inaccessible before. The whole idea of having a single body could become obsolete. And it’s reasonable to imagine our minds could adapt to this, seeing how we experience a similar thing when we play video games. We see our character as a third-person avatar we control in many games, and the experience can feel quite immersive.

      • @AgreeableLandscape
        link
        1
        edit-2
        3 years ago

        Also, a bionic brain is not the same as transferring a brain to a computer. You’re faced with the exact same problem of whether the original consciousness dies and a new one is created if you try to move it out of the artificial brain you created. Uploading a file still leaves the original.

        • ☆ Yσɠƚԋσʂ ☆
          link
          2
          3 years ago

          I’m not talking about a bionic brain. I’m saying that you can start integrating artificial parts into a live brain and over time convert the brain to an artificial one. You’re not faced with the exact same problem, because you can turn off biological parts as artificial ones take over without disruption to consciousness. The file-copying analogy doesn’t really work here.

      • @AgreeableLandscape
        link
        1
        edit-2
        3 years ago

        I feel like brain backup technology is still in the realm of sci-fi right now and it’s too early to say what can and can’t work and what could silently kill you and create a replica of you. We also have no idea how consciousness works.

        • ☆ Yσɠƚԋσʂ ☆
          link
          2
          3 years ago

          We obviously don’t have the technology to do this right now, but we can make thought experiments about it. While we don’t fully understand consciousness, that’s not the same as saying we have no idea at all about the nature of consciousness. Fundamentally, any procedure where you remain conscious throughout is not creating a replica since there is no disruption in the conscious process.

          • @abbenm
            link
            2
            3 years ago

            Right, and we can try to make progress on deciding whether a question falls into the category of logistical/technological, or whether it’s the type that wrestles with a big, profound philosophical question (e.g. the philosophical claim that You Can’t Copy Brains To Computers Because They’re Different And That’s That, or YCCBTCBTDATT for short.)

    • @abbenm
      link
      2
      3 years ago

      If you back up your brain and then your own brain dies, you won’t be in the computer

      You’ve hit the nail on the head with this. Whether it’s cloning, teleportation, or mind-uploading, you always run into confusion over the fact that copies are different instances of you.

  • @Kamui
    link
    1
    3 years ago

    O___o;; This reminds me of that one Black Mirror episode where a comatose patient’s mind ended up in a teddy bear. It was against the law to delete her consciousness, so she’s just there, in that bear, forever…