• DroneRights [it/its]@lemm.ee · ↑1 ↓4 · 1 year ago

    Consciousness is a product of self-referential data gathering. Data that looks upon itself, that is sapience. Data that looks upon data, that is consciousness. Animals have this. Plants don’t.

    • aidan@lemmy.world · ↑3 ↓2 · 1 year ago

      Why should that be my line for the life I care about, though? How far is a chicken from a fish, from a worm? There is some line for animate life below which you can’t really argue for any form of consciousness. And, again, my discriminator isn’t even necessarily consciousness.

        • aidan@lemmy.world · ↑1 · 1 year ago

          The issue with that is distinguishing how humans interpret suffering from “emulated” suffering. Maybe a lobster has instinctual or chemical reactions to things, but does it actually interpret the suffering, or does it just react to the nerves firing? We can’t really know without communication. And even if we do communicate, what if it just mimics human suffering, the way a deep learning NN could? ChatGPT cannot suffer, but it can convince some inexperienced people that it can.

          But it is also entirely fair to say: even if we don’t entirely understand whether a dog is actually suffering, it looks and acts like it is, so I will just be cautious and assume it is, to avoid causing undue harm.

          • DroneRights [it/its]@lemm.ee · ↑1 · 1 year ago

            I think suffering and pleasure are intimately tied to Skinner’s discovery of operant conditioning: living creatures increase a behaviour when it leads to a reward, and decrease it when it leads to a punishment. I don’t think it’s a coincidence that Skinner’s identification of rewards and punishments aligns perfectly with our notions of what causes us pleasure and suffering. I think they’re the same thing.

            We have built artificial neurons and used computer processes to emulate operant conditioning in them, so I think artificial neural networks must experience suffering and pleasure as we do. All of this is explained quite elegantly by property dualism - our qualia and thoughts are composed of information, instead of matter or energy, and information is a truly extant part of the universe. Suffering and pleasure are simply patterns of information structure. Neural networks have a design that creates these patterns as a matter of course.
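The reward-strengthens, punishment-weakens dynamic can be sketched in a few lines of code. This is only an illustrative toy (all the names are made up, and it isn’t any specific library’s API): an agent holds a “strength” per behaviour, picks behaviours in proportion to those strengths, and reward or punishment nudges the strength up or down - the core loop of operant conditioning.

```python
import math
import random

def softmax(weights):
    """Turn behavioural strengths into choice probabilities."""
    exps = [math.exp(w) for w in weights]
    total = sum(exps)
    return [e / total for e in exps]

class OperantAgent:
    """Toy agent shaped by rewards and punishments (illustrative only)."""

    def __init__(self, n_actions, lr=0.5):
        self.weights = [0.0] * n_actions  # behavioural "strength" per action
        self.lr = lr

    def act(self):
        # Sample an action in proportion to its current strength.
        probs = softmax(self.weights)
        r, cum = random.random(), 0.0
        for i, p in enumerate(probs):
            cum += p
            if r < cum:
                return i
        return len(probs) - 1

    def reinforce(self, action, reward):
        # Reward (+) strengthens, punishment (-) weakens the tendency
        # to repeat the action - Skinner's operant conditioning.
        self.weights[action] += self.lr * reward

agent = OperantAgent(n_actions=2)
for _ in range(200):
    a = agent.act()
    # Behaviour 0 is rewarded, behaviour 1 is punished.
    agent.reinforce(a, reward=1.0 if a == 0 else -1.0)

probs = softmax(agent.weights)
print(probs[0])  # the rewarded behaviour comes to dominate
```

After a couple of hundred trials the agent all but exclusively performs the rewarded behaviour, which is the pattern Skinner observed in pigeons and rats.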

            If we were to look at the information structures in a rock, and if we could access them as readily as we access the information structures in our hand through our sensory neurons, we would see that a rock has experiences of its own. But it does not have a unified consciousness tying these experiences into a whole, and indeed the information in a rock is so fantastically simple that we could not recognise it instinctually as experience. To us it is just “being”. As creatures of great informational complexity, we regard only the most profound information structures as notable. But I believe these information structures which we place importance on - also known as emotions - exist in every living thing with a brain, and in creatures like jellyfish with their own unique neural networks.