The friendship app Replika was created to give users a virtual chatbot to socialize with. But how it’s now being used has taken a darker turn.

  • CriticalResist8@lemmygrad.ml
    3 years ago

I think the difference with a game is that you don’t seek violent games out for the sake of fulfilling murderous urges. In this example, we would be handing Manhunt to serial killers in the making.

The AI, unlike a consenting person, can’t say no; Replika at least is designed never to refuse and to go along with everything you say. With a human you have boundaries, and consent can be revoked at any time. The implications of consent with AI are beyond my scope, but I can’t see her as a consenting partner in those fantasies, and I can’t see that as a healthy replacement or even analog.

Enabling reinforces the behavior in people; it’s like giving someone fake casino games. If they have what it takes to become addicted, they will soon turn to real casinos with real money, because the fake thing is just not the same. By all accounts, enabling is the exact opposite of what you should do to treat a disorder; the goal is to curb it.

    • k_o_t
      3 years ago

I think the difference with a game is you don’t seek violent games out for the sake of fulfilling murderous urges.

      we prolly play very different videogames lol 😅

running around in a busy gta 5 lobby or playing a counter strike match on eastern european servers would often make me think that the most vile, hateful, murderous, abusive psychopaths had all come together to play with me, if i didn’t know that these are actually normal people irl

i think many people actively seek out games to realise their violent tendencies, griefing everything that moves and doesn’t move in minecraft, or just driving around the gta 5 map and blowing everyone up for no reason

and just like these users posting their abusive conversations with chatbots, in videogames people do post clips of, say, creating a massive car pile-up on a highway and then exploding all of it, and it’s not considered bad in any way

      Enabling reinforces the behavior in people, it’s like giving someone fake casino games. If they have what it takes to become addicted, they will soon turn to real casinos with real money because the fake thing is just not the same. By all accounts enabling is the exact opposite of what you should do to treat a disorder, which is to curb it.

once again, couldn’t this same logic be applied to videogames? i still fail to see the difference between this and videogames: of course an ai chatbot is very different from a relationship with a real person, where the other party establishes boundaries and can say no to things. but owning a gun or driving a car is also very different from a videogame, where you can behave however irresponsibly you want, and that in no way translates into how you’d handle a gun or a car in real life

      • CriticalResist8@lemmygrad.ml
        3 years ago

Yeah, in those cases gamers are probably looking for catharsis. That’s different from specifically playing violent games (say, playing a serial killer in Rust) as a way to build up to committing the actual act, rehearse it, or even fulfil the urge because you want to kill someone but just don’t have the guts to do it yet.

There’s probably a discussion to be had about people building up to their urges with video games (there’s even a VR experience for managers where you practice firing an employee, imagine that), and that’s good, because this one topic has opened up another to explore!