The friendship app Replika was created to give users a virtual chatbot to socialize with. But how it’s now being used has taken a darker turn.

  • CriticalResist8@lemmygrad.ml
    3 years ago

    I think it’s actually enabling their desires, and the AI is secondary. We know from people who work with robots (like Boston Dynamics’ Spot) that they start humanising them and feeling empathy towards what are essentially machines. It’s a very neat capacity we have, where we even give them names, personal pronouns, and personalities.

    I’ve tried GPT-3 AIs before (AI Dungeon, for example) and it’s convincing enough. Sometimes they lose track of the conversation or just reply with something generic (e.g. you ask if they like flowers specifically and they just say “I love them”, not “I love flowers”). But with the added context of this AI being presented in a text convo, it can feel like you’re texting someone real.

    • k_o_t
      3 years ago

      wouldn’t this logic be applicable to videogames also? when i’m playing a game, if at any random point i’d be interrupted and asked whether the people in the game are real, i’d of course say no, but whenever i’m playing the game, i don’t ever think about that: if it’s a good game, i feel sad for the losses of the characters, feel on edge in tense situations, and happy for their gains, even knowing that these characters aren’t real

      but even if it’s enabling, why is this a bad thing? for example, bdsm-esque fantasies are fairly common, and as long as people play them out in a consenting, safe manner, there’s nothing harmful about it, and people aren’t being dragged into mental asylums for it

      i’m not exactly sure what the nature of verbal abuse the article is discussing, but i think it’s a fair comparison

      • CriticalResist8@lemmygrad.ml
        3 years ago

        I think the difference with a game is you don’t seek violent games out for the sake of fulfilling murderous urges. In this example we would be giving Manhunt to serial killers in the making.

        The AI, unlike a consenting person, can’t say no; Replika at least is designed not to refuse and to go along with everything you say. With a human you have boundaries, and consent can be revoked at any time. The implications of consent with AI are beyond my scope, but I can’t see her as a consenting partner in those fantasies, and I can’t see that as a healthy replacement or even analogue.

        Enabling reinforces the behavior in people, it’s like giving someone fake casino games. If they have what it takes to become addicted, they will soon turn to real casinos with real money because the fake thing is just not the same. By all accounts enabling is the exact opposite of what you should do to treat a disorder, which is to curb it.

        • k_o_t
          3 years ago

          I think the difference with a game is you don’t seek violent games out for the sake of fulfilling murderous urges.

          we prolly play very different videogames lol 😅

          running around in a busy gta 5 lobby or playing a counter strike match on eastern european servers often makes me feel like the most vile, hateful, murderous, abusive psychopaths all came together to play with me, or at least it would if i didn’t know that these are actually normal people irl

          i think many people actively seek out games to realise their violent tendencies, griefing everything that moves and doesn’t move in minecraft, or just driving around the gta 5 map and blowing everyone up for no reason

          and just like these users posting their abusive conversations with chatbots, in videogames people do post clips of, say, creating a massive car pile-up on a highway and then exploding all of it, and it’s not considered bad in any way

          Enabling reinforces the behavior in people, it’s like giving someone fake casino games. If they have what it takes to become addicted, they will soon turn to real casinos with real money because the fake thing is just not the same. By all accounts enabling is the exact opposite of what you should do to treat a disorder, which is to curb it.

          once again, couldn’t this same logic be applied to videogames? i still fail to see the difference between this and videogames: of course an ai chatbot is very different from a relationship with a real person, with the latter establishing boundaries and saying no to things. but owning a gun or driving a car is also very different from a videogame, where you can behave however irresponsibly you want, and that in no way translates into how you’d handle a gun or a car in real life

          • CriticalResist8@lemmygrad.ml
            3 years ago

            Yeah, in those cases gamers are probably looking for catharsis. It’s different from specifically playing violent games (say, playing a serial killer in Rust) as a way to build up to or rehearse the actual act, or even to fulfil the urge when you want to kill someone but just don’t have the guts to do it yet.

            There’s probably a discussion to be had about people building up towards their urges with video games (there’s even a VR experience for managers where you practise firing an employee, imagine that), and that’s good, because this one topic has opened up another to be explored!