• beepaboopa@lemmy.world · 7 months ago

    I don’t understand the problem. I don’t see why virtual “embodiment” would be any more difficult a problem than the rest of a ‘brain in a vat’ setup.

    • criitz@reddthat.com · 7 months ago

      I think the argument comes down to this:

      Thompson and Cosmelli conclude that to really envat a brain, you must embody it. Your vat would necessarily end up being a substitute body. Note that they aren’t claiming the substitute body has to be flesh and blood.

      – That the brain requires something wrapped around it, responding to it and pumping the right juices to and fro.

      Which is fine and makes total sense. From there, however, the rest of the argument seems to veer into the purely philosophical, i.e. whether such a “body” changes the definition of what it means to “just” be a brain in a vat. It seems somewhat semantic to me, but I guess they want to make the point that consciousness is not just electrical signals.

    • TeddE@lemmy.world · 7 months ago

      Agreed, the author simply asserts that brains are interactive as if that’s some big gotcha. The whole idea of the vat is that the machinery is complicated enough to handle these things, at least well enough to convince your own system.

      Sanderson’s short story ‘Perfect State’ actively explores the edge cases of what a simulated world would have to do to deal with ‘I survived this thing that normally kills everyone’.

    • insomniac_lemon@kbin.social · 7 months ago

      I think the idea is probably that it would need a lot of simulated data/interactions. I could see a case both for the believability of the simulation (particularly with our pattern-seeking brains) and for the normal, long-term operation of the brain.

      On the second part, think about how things like inner-ear problems and depersonalization exist, or the agony that comes with a plugged ear or nose. Imagine if you lost the sense of weight, pressure, or temperature… or even just lost accuracy (or gained delay) in it: do you think you would move the same? Do you think it would have any effect on your brain over time? This goes particularly for someone who grew up with a body; I think it would be likely for most people to develop some kind of dissonance from noticing inconsistencies. (If the idea is that the brain has always been in a vat, that doesn’t apply as much, though that could lead to the argument that it would create developmental differences, or at least a lack of attunement to physical life. EDIT: also garbage-in-garbage-out, particularly even just the quality of socialization for your entire life.)

      Of course, I think something like brainVR could work if it were something you were aware of, but even then it would probably be better to just patch in (or not interfere with) sensory data from your body/local environment.

  • insomniac_lemon@kbin.social · 7 months ago

    If I am a BiV, there must be some very confused scientists wondering why my simulation has been so stagnant for so long, and likely why (unrelated to this thought experiment) I focus on the idea of being a BiV cyborg.

    I think symbiosis would be the way to go when it comes to homeostasis, especially with more integration for control and management (given how modern life doesn’t really foster healthy microbiomes). Symbiosis with more than just bacteria would also be interesting, opening up new possibilities if it were feasible. Though of course I am thinking of a mobile or semi-mobile existence, not just a simulation.