But why would you want to opt out of this, Luddite? smuglord

  • ObamaSama [he/him]@hexbear.net · 3 months ago
    Every day I feel more vindicated in my choice to drop social media nearly a decade ago. Remember being young and being told to never ever post any personal info online? Wild how society just kinda skipped over that whole thing.

    • Azarova [they/them]@hexbear.net · 3 months ago

      Felt like that went out the window so fast too. Like as soon as Facebook became popular, suddenly it was totally fine to post your full name online.

  • Awoo [she/her]@hexbear.net · 3 months ago

    The methods we’re currently using to create AI suck ass if they need the sum of all human creation as training data just to produce eldritch horrors with six arms, three legs, and the wrong number of fingers on every hand. A child doesn’t get even 1% of that data and can do better at five years old if they’ve been practicing art the whole time.

    The entire approach to the problem is wrong.

    • KobaCumTribute [she/her]@hexbear.net · 3 months ago

      Just going off open source models, most of them are at least somewhat better than that out of the box, and certain strains of model have a bunch of tools built around forcing them into a more reasonable state (modern descendants of SD1.5 - a model from two years ago - fail hard on their own nine times out of ten, but there’s a mature ecosystem of tools to control post-NAI-leak SD1.5 and cover for its flaws). The most recent big model, Flux, is terrifyingly accurate on its own, although it still tends to get proportions subtly wrong in very off-putting ways.

      Closed source models are impenetrable, and no one knows what they’re doing under the hood: the public only interacts with them through prompt boxes, and every company is super secretive about whatever it’s doing.

      That said, I do agree it feels like there’s something fundamentally wrong with the way “AI” is currently being focused on. What’s being trained now are essentially eyes (and backwards eyes that reverse the process of seeing to produce an image from how that image might be parsed) and speech centers, and there’s all this hype that the speech-center bits can just be made big enough to start being smart instead of just being a potentially useful bit of language processing. I really can’t help but feel like it’s a flawed overfixation on one novel bit of tech, kind of like how rockets became the standard for space launches because primitive rocket tech looked neat and had already been developed.

  • sometimes when I talk IRL about online privacy and my impulse not to share my user data, shopping habits, etc. with anyone and everyone, I get pushback from otherwise smart people who want to know why I don’t want to be tracked. I’m not even talking about doing Snowden-type shit, just very basic avoidance, like not using advertisers’ browsers or installing all the treat apps for discount treats at the treat shoppe.

    it always throws me, because it seems self-evident. but these people seem to think that receiving highly specific and targeted marketing benefits them by showing them products and services they’re interested in.

    I can’t really wrap my mind around it, except my guess is they believe they have complete control and agency over their attention and would never be influenced to do or believe something against their interests.

    which comes across as super naive to me.

    • Belly_Beanis [he/him]@hexbear.net · 3 months ago

      When this conversation comes up, my go-to is talking about innocent people getting doxxed: the Boston Bomber case, that airplane doctor, a lot of kids/minors, etc. It’s easy for people to grasp that they can randomly be targeted by a lynch mob because they look like someone or their name matches the wrong guy.

      You can also get into deep dives on people figuring out where a photo was taken just by looking at things like which way the wind was blowing, contrails in the sky, or where the sun is. You explain that this can be used to track their kids or find out where they live.

      Furthermore, just by mapping out a person’s relationships, people can work out the answers to common security questions (mother’s maiden name, where they went to school in third grade, etc.). These can be used for ID theft.

      My personal concern is that there are no protections on privacy. As we’ve seen with the overturning of Roe v. Wade, social media and electronics can be used to track people in order to convict them of crimes. This can happen with any issue. Republicans may be fine with abortion being outlawed, but how are they going to feel when Democrats finally repeal the Second Amendment and throw them into FEMA camps or whatever nonsense conservatives are thinking of? The government can figure out if you’re part of a gun club, or maybe it sees you in the background of your friend’s picture at a shooting range.

      Once you start piling on the bad things that can happen to them, people start realizing they need to delete Facebook and Instagram.

    • Sauerkraut@discuss.tchncs.de · 3 months ago

      The entire point of marketing is to manipulate us into buying shit we don’t need and don’t want. If it didn’t work, companies wouldn’t throw billions at it.

  • keepcarrot [she/her]@hexbear.net · 3 months ago

    I do not welcome Australian AI. Also, Australian Facebook users are especially garbage versions of Australians, who are already pretty trash.