• CluckN@lemmy.world
    · 8 months ago

    Pshh, I’m working on an AI blockchain cloud based customer first smart learning adaptive agile Air Fryer that will blow the competition away.

  • Pennomi@lemmy.world
    · 8 months ago

    Me too, but I make pathfinding algorithms for video game characters. The truly classic Artificial Intelligence.
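    Classic game pathfinding in that sense is just graph search. A minimal sketch of breadth-first search on a tile grid (the grid, walls, and coordinates below are invented for illustration; a real engine would typically run A* over a navmesh instead):

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Shortest path on a 4-connected tile grid; '#' cells are walls."""
    rows, cols = len(grid), len(grid[0])
    came_from = {start: None}   # maps each visited cell to its parent
    frontier = deque([start])
    while frontier:
        r, c = frontier.popleft()
        if (r, c) == goal:
            break
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] != '#' and (nr, nc) not in came_from:
                came_from[(nr, nc)] = (r, c)
                frontier.append((nr, nc))
    if goal not in came_from:
        return None  # goal unreachable
    path, node = [], goal
    while node is not None:  # walk parent links back to the start
        path.append(node)
        node = came_from[node]
    return path[::-1]

grid = ["....",
        ".##.",
        "...."]
path = bfs_path(grid, (0, 0), (2, 3))  # routes around the '#' wall
```

    A* is the same loop with a priority queue ordered by path cost plus a heuristic; for NPCs most of the apparent "intelligence" lives in the costs you assign to tiles.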

    • bobotron@lemm.ee
      · 8 months ago

      I still remember fighting grunts in the original half life for the first time and being blown away. Your work makes games great!

      • barsoap@lemm.ee
        · 8 months ago

        It’s been a while since I looked at how Valve does it but it could be called a primitive expert system. And while the HL1 grunts were extraordinary for their time, HL2’s combine grunts are still pretty much the gold standard. Without the AI leaking information to the player via radio chatter it would feel very much like the AI is cheating because yes, HL2’s grunts are better at tactics than 99.99% of humans. It also helps that you’re a bullet sponge so them outsmarting you, like leading you into an ambush, doesn’t necessarily mean that you’re done for.

        OTOH they’re a couple of pages of state-machines that would have no idea what to do in the real world.

        Also, for the record: “AI” in gamedev basically means “autonomous agent in the game world not controlled by the player”. A “follow the ball” algorithm (hardly an algorithm at all) playing pong against you is AI in that sense. Machine learning approaches are quite rare, and when they do show up you’d use something like NEAT, not the gazillion-parameter neural nets behind LLMs and diffusion models. If you tell NEAT to, say, drive a virtual car, it’ll spit out a network with a couple of neurons that’s very good at exactly that and useless for anything else, but that doesn’t matter: you have an enemy AI for your racer. Which is probably even too good, again, so you have to nerf it.
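        For scale, a “couple of pages of state-machines” enemy looks something like this. Every state name, distance, and threshold below is invented for illustration, not Valve’s actual code:

```python
def grunt_ai(state, dist_to_player, health):
    """One tick of a toy enemy state machine.

    Returns the next state. All states and thresholds are
    made up; a shipped enemy would have a few pages of these.
    """
    if state == "idle":
        # Wake up when the player gets close enough to notice.
        return "chase" if dist_to_player < 20 else "idle"
    if state in ("chase", "attack"):
        if health < 25:
            return "retreat"  # low health overrides aggression
        return "attack" if dist_to_player < 5 else "chase"
    if state == "retreat":
        # Only calm down once the player is well out of range.
        return "retreat" if dist_to_player < 30 else "idle"
    raise ValueError(f"unknown state: {state}")
```

        A squad's worth of apparent tactics falls out of a handful of transitions like these, plus the radio chatter that tells the player which transition just fired.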

  • Ekky@sopuli.xyz
    · 8 months ago

    LLMs (or really ChatGPT and MS Copilot) having hijacked the term “AI” is really annoying.

    In more than one questionnaire or discussion:

    Q: “Do you use AI at work?”

    A: “Yes, I make and train CNN (find and label items in images) models etc.”

    Q: “How has AI influenced your productivity at work?”

    A: ???

    Can’t mention AI or machine learning in public without people instantly thinking of LLMs.

    • smeg@feddit.ukOP
      · 8 months ago

      I imagine this is how everyone who worked in cryptography felt once cryptocurrency claimed the word “crypto”

      • dvlsg@lemmy.world
        · 8 months ago

        I’m still mad that ML was stolen and doesn’t make people think about the ML family of programming languages anymore.

        • OhNoMoreLemmy
          · 8 months ago

          The term machine learning was coined in 1959 by Arthur Samuel, an IBM employee and pioneer in the field of computer gaming and artificial intelligence.[9][10] The synonym self-teaching computers was also used in this time period.[11][12]

          https://en.m.wikipedia.org/wiki/Machine_learning

          It wasn’t so much stolen as taken back.

      • Ekky@sopuli.xyz
        · 8 months ago

        Luckily that was only the abbreviation and not the actual word. I know that language changes all the time, constantly, but I still find it annoying when a properly established and widely (within reason) used term gets appropriated and hijacked.

        I mean, I guess it happens all the time with fiction, and in the sciences you sometimes run into a situation where an old term just doesn’t fit new observations, but please keep your slimy, grubby, way-too-adhesive klepto-grappers away from my perfectly fine professional umbrella terms. :(

        Please excuse my rant.

    • funkless_eck@sh.itjust.works
      · 8 months ago

      I had a first-stage interview with a large multinational construction company where I’d be “the only person in the organization sanctioned to use AI”.

      They meant: use ChatGPT to generate blogs.

      • MonkeMischief@lemmy.today
        · 8 months ago

        “That’s some high security clearance to have a computer rapidly tap auto-complete for entire paragraphs, hoss…wait it pays how much?(Ahem) I shall take this solemn responsibility of the highest order so very seriously!” Lol

    • marcos@lemmy.world
      · 8 months ago

      We are just taking “crypto” back to mean something useful. It was just a matter of some stupid people losing enough money.

      I hope in a few years we can take “AI” back too.

    • explodicle@sh.itjust.works
      · 8 months ago

      They’re just betting on what will get bailed out because of their own bribes. It’s pure feedback at this point; all noise and no signal.

  • Waldowal@lemmy.world
    · 8 months ago

    As an older developer, I’d say you could replace “machine learning” with “statistical modeling” and “artificial intelligence” with “machine learning”.

    • QuaternionsRock@lemmy.world
      · 8 months ago

      I think people are hesitant to call ML “statistical modeling” because traditional statistical models approximate the underlying phenomena; e.g., a logarithmic regression would only be used to study logarithmic phenomena. ML models, by contrast, seldom resemble what they’re actually modeling.
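      That contrast is easy to make concrete: a logarithmic regression has two parameters you can read off and interpret, because the model’s form mirrors the assumed phenomenon. A minimal pure-Python sketch (the data below is synthetic, generated from y = 2 + 3·ln x):

```python
import math

def fit_log(xs, ys):
    """Least-squares fit of y = a + b*ln(x).

    Substituting t = ln(x) turns this into ordinary linear
    regression, solved here in closed form.
    """
    ts = [math.log(x) for x in xs]
    n = len(ts)
    t_bar = sum(ts) / n
    y_bar = sum(ys) / n
    b = sum((t - t_bar) * (y - y_bar) for t, y in zip(ts, ys)) \
        / sum((t - t_bar) ** 2 for t in ts)
    a = y_bar - b * t_bar
    return a, b

# Synthetic data from y = 2 + 3*ln(x); the fit recovers a ≈ 2 and b ≈ 3,
# and those two numbers are the whole model, unlike a neural net's weights.
xs = [1, 2, 4, 8, 16]
ys = [2 + 3 * math.log(x) for x in xs]
a, b = fit_log(xs, ys)
```

      The point being: here the fitted parameters have direct meaning in terms of the phenomenon, which is exactly what a large ML model’s parameters don’t.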

    • cmfhsu@lemmy.world
      · 8 months ago

      We used to distinguish AI as automatically/programmatically making a decision based on an ML model, but I’m guilty of calling it AI for the wow factor, lol.

      Now I have to be careful because AI = LLMs in common language.

  • tory@lemmy.world
    · 8 months ago

    My old coworker used to say this all the time back around 2018:

    "What’s the difference between AI and machine learning?

    Machine learning is done in Python. AI is done in PowerPoint."

  • Emptiness@lemmy.world
    · 8 months ago

    There needs to be a new taunting community for this MLM similar to the one for #buttcoin.

  • Ephera
    · 8 months ago

    Whenever people say “AI”, I like to mentally insert an M, G and C: ✨Magic✨

    Or as it’s also known:
    ✨I don’t want to explain what I actually did, so here’s a meaningless word to stop you asking questions.✨

  • FaceDeer@fedia.io
    · 8 months ago

    Machine learning is a subset of artificial intelligence, so I don’t see anything wrong here. The character’s using a more generic term when talking to a layperson.

    • apocalypticat@lemmy.world
      · 8 months ago

      I think the point they’re making is that they used the latest buzzword for the people dishing out the dough.

      • FaceDeer@fedia.io
        · 8 months ago

        Yes, and I’m saying there’s nothing wrong with that “buzzword.” It’s accurate, just more generic.

        I see a lot of people these days raising objections that LLMs and whatnot “aren’t really artificial intelligence!” because they’re operating from the definition of artificial intelligence they got from science-fiction TV shows, where it’s not AI unless it replicates or exceeds human intelligence in all meaningful ways. The term has been widely used in computer science for 70 years, though, applying to a broad range of subjects. Machine learning is clearly within that range.

        • Ephera
          · 8 months ago

          There’s a distinction between “narrow AI” and “Artificial General Intelligence”.

          AGI is that sci-fi AI, whereas narrow AI is only intelligent within one task, like a pocket calculator or a robot arm or an LLM.

          And as you point out, saying that you’re doing narrow AI is absolutely not interesting. So, I think, it’s fair enough that people would assume, when “AI” is used as a buzzword, it doesn’t mean the pocket calculator kind.

          Not to mention that e.g. OpenAI explicitly states that they’re working towards AGI.

  • Socsa@sh.itjust.works
    · 8 months ago

    One guy spends a summer implementing a backprop algorithm in CUDA and now my mom thinks butterflies are stealing her blood at night.