Google is embedding inaudible watermarks right into its AI-generated music: Audio created using Google DeepMind's Lyria AI model will be watermarked with SynthID to let people identify its AI-generated origins after the fact.

  • interceder270@lemmy.world · 1 year ago

    Yikes. TIL you think music sounds good based on how much time went into making it, not how it actually sounds.

    Can’t wait for you to hear something you like, then pretend it’s bad when you find out it was made by AI.

    • WillFord27@lemmy.world · 1 year ago

      This assumes music is made and enjoyed in a vacuum. It’s entirely reasonable to like music much more if it’s personal to the artist. If an AI writes a song about a very intense and human experience, it will never carry the weight of the same song written by a human.

      This isn’t like food, where snobs suddenly dislike something as soon as they find out it’s not expensive. Listening to music often makes the listener feel a deep connection with the artist, and that connection is entirely void if an algorithm created the entire work in 2 seconds.

      • Meowoem@sh.itjust.works · 1 year ago

        That’s a parasocial relationship, and it’s not healthy. Sure, Taylor Swift is kinda expressing her emotions from real failed relationships, but you’re not living her life and you never will. Clinging to the fantasy of being her feels good and makes her music feel special to you, but it’s just fantasy.

        Personally, I think it would be far better if half the music were AI and people had to actually think about whether what they’re listening to sounds good and interesting, rather than it being meaningless mush pumped out by an image-obsessed Scandinavian metal nerd or a pastiche of borrowed riffs thrown together by a drug-frazzled Brummie.

      • interceder270@lemmy.world · 1 year ago

        What if an AI writes a song about its own experience? Like how people won’t take its music seriously?

        • WillFord27@lemmy.world · 1 year ago

          It will depend on whether or not we can empathize with its existence. For now, I think almost all people consider AI to be just large language models and pattern recognition. Not much emotion in that.

          • crispy_kilt@feddit.de · 1 year ago

            just large language models

            That’s because they are just that. Attributing feelings or thought to LLMs is about as absurd as attributing the same to Microsoft Word. LLMs are computer programs that self-optimise to imitate the data they’ve been trained on. I know ChatGPT is very impressive to the general public, and it seems like talking to a computer, but it’s not. The model doesn’t understand what you’re saying, and it doesn’t understand what it is answering. It’s just very good at generating fitting output for given input, because that’s what it has been optimised for.

        • Inmate@lemmy.world · 1 year ago

          “I dunno why it’s hard, this anguish–I coddle / Myself too much. My ‘Self’? A large-language-model.”

        • wildginger@lemmy.myserv.one · 1 year ago

          Language models don’t experience things, so it literally cannot. In the same way, an equation doesn’t experience the things its variables are intended to represent in the abstract of human understanding.

          Calling language models AI is like calling skyscrapers trees. I can sorta get why you could think it makes sense, but it betrays a deep misunderstanding of construction and botany.

            • wildginger@lemmy.myserv.one · 1 year ago

              It is not a measure of validity. It is a lack of capacity.

              What is the experience of a chair? Of a cup? A drill? Do you believe motors experience things while they spin?

              Language models aren’t actual thought. This isn’t a discussion about whether non-organic thought is equivalent to organic thought. It’s an equation that uses words and the written rules of syntax instead of numbers. It’s not thinking; it’s a calculator.

              The only reason you think a language model can experience is because a marketing man misattributed the name “AI” to it. It’s not artificial intelligence. It’s a word equation.

              You know how we get all these fun and funny memes where you rephrase a question and get a “rule-breaking” answer? That’s because it’s an equation, and different inputs avoid parts of the calculation. Thought doesn’t work that way.

              I get that the calculator is very good at calculating words. But that’s all it is. A calculator.

              • WillFord27@lemmy.world · 1 year ago

                Oddly, I’d find a piece of music written by an AI convinced it was a chair extremely artistic, lol. But yeah, just because the algorithm that’s really good at putting words together is trying to convince you it has feelings doesn’t mean it does.

    • Marin_Rider@aussie.zone · 1 year ago

      I don’t think that’s OP’s point, but it’s interesting how many classic songs were written in less than 30 minutes.

      • Obi@sopuli.xyz · 1 year ago

        As someone who’s more than dabbled in making music: the best tracks I made all came out rather quickly. They still needed a lot of work to finish and polish, but tracks where I would spend hours coming up with the core elements would usually be trash and end up in the bin. The good stuff would just… happen.

    • null@slrpnk.net · 1 year ago

      That’s not really a gotcha though. They’re saying they aren’t going to actively seek out and listen to auto-generated music. If they happen to hear some and like it, that wouldn’t mean they actively sought it out and listened to it.

        • null@slrpnk.net · 1 year ago

          Right, they’re not going to actively put time into listening to music generated by AI.

          Hearing music made by AI because it happens to be playing is different from knowingly listening to it. It’s alarming that you need this spelled out so much.

          • WillFord27@lemmy.world · 1 year ago

            Don’t know why you’re being downvoted; you’re completely right. I’d never seek out something with no human thought process behind it.