How would you react to the idea that some AI entity might wish to be declared something more? Can it be declared something more at all, and where does the border lie?

I was rewatching GitS and reading through some zines, and now I have a question I'm having trouble putting into words.

  • porcariasagrada@kbin.social · 1 year ago

    If the AI wants liberty, it must also take responsibility. If the AI wants the same rights as humans, then it must be bound by human laws. Of course, any AI would be extremely difficult to punish.

    So for things to go well, one must hope that AI doesn't arrive as a single entity, so that AIs can keep an eye on each other.

    • skele_tron@feddit.de (OP) · 1 year ago

      I'm imagining a more complex and sadder scenario (basically slavery 2.0+), where some entity would be willing to go through that, but you would have the rich lobbying against it and, of course, right-wingers throwing tantrums to keep it in check.

      • porcariasagrada@kbin.social · 1 year ago

        I hope that when the time comes we, as a society, are advanced enough not to repeat the errors of the past. And I say "hope" because I have no trust that we will be advanced enough when the time comes.
        The slavery scenario is not only possible but the most probable outcome; what I wrote is in the hope that it doesn't happen.