Google is embedding inaudible watermarks right into its AI-generated music

Audio created using Google DeepMind’s Lyria AI model will be watermarked with SynthID to let people identify its AI-generated origins after the fact.

  • WillFord27@lemmy.world · 1 year ago

    This assumes music is made and enjoyed in a void. It’s entirely reasonable to like music much more if it’s personal to the artist. If an AI writes a song about a very intense and human experience, it will never carry the weight of the same song written by a human.

    This isn’t like food, where snobs suddenly dislike something as soon as they find out it’s not expensive. Listening to music often has the listener feel a deep connection with the artist, and that connection is entirely absent if an algorithm created the whole work in 2 seconds.

    • Meowoem@sh.itjust.works · 1 year ago

      That’s a parasocial relationship, and it’s not healthy. Sure, Taylor Swift is kinda expressing her emotions from real failed relationships, but you’re not living her life and you never will. Clinging to the fantasy of being her feels good and makes her music feel special to you, but it’s just fantasy.

      Personally, I think it would be far better if half the music was AI and people had to actually think about whether what they’re listening to sounds good and interesting, rather than it being meaningless mush pumped out by an image-obsessed Scandinavian metal nerd or a pastiche of borrowed riffs thrown together by a drug-frazzled Brummie.

    • interceder270@lemmy.world · 1 year ago

      What if an AI writes a song about its own experience? Like how people won’t take its music seriously?

      • WillFord27@lemmy.world · 1 year ago

        It will depend on whether or not we can empathize with its existence. For now, I think almost all people consider AI to be just language learning models and pattern recognition. Not much emotion in that.

        • crispy_kilt@feddit.de · 1 year ago (edited)

          > just language learning models

          That’s because they are just that. Attributing feelings or thought to LLMs is about as absurd as attributing the same to Microsoft Word. LLMs are computer programs that self-optimise to imitate the data they’ve been trained on. I know ChatGPT is very impressive to the general public, and it seems like talking to a computer, but it’s not. The model doesn’t understand what you’re saying, and it doesn’t understand what it’s answering. It’s just very good at generating fitting output for given input, because that’s what it has been optimised for.

      • Inmate@lemmy.world · 1 year ago (edited)

        “I dunno why it’s hard, this anguish–I coddle / Myself too much. My ‘Self’? A large-language-model.”

      • wildginger@lemmy.myserv.one · 1 year ago

        Language models don’t experience things, so one literally cannot. In the same way, an equation doesn’t experience the things its variables are intended to represent in the abstract of human understanding.

        Calling language models AI is like calling skyscrapers trees. I can sorta get why you might think it makes sense, but it betrays a deep misunderstanding of construction and botany.

          • wildginger@lemmy.myserv.one · 1 year ago

            It is not a measure of validity. It is a lack of capacity.

            What is the experience of a chair? Of a cup? A drill? Do you believe motors experience, while they spin?

            Language models aren’t actual thought. This isn’t a discussion about whether non-organic thought is equivalent to organic thought. It’s an equation that uses words and the written rules of syntax instead of numbers. It’s not thinking; it’s a calculator.

            The only reason you think a language model can experience is because a marketing man misattributed it the name “AI.” It’s not artificial intelligence. It’s a word equation.

            You know how we get all these fun and funny memes where you rephrase a question and get a “rule-breaking” answer? That’s because it’s an equation, and different inputs avoid parts of the calculation. Thought doesn’t work that way.

            I get that the calculator is very good at calculating words. But that’s all it is. A calculator.

              • WillFord27@lemmy.world · 1 year ago

                Personally, I choose to believe that the people around me are real. In theory, you can’t trust anyone but yourself. I know language models don’t have humanity. I guess that’s the difference.

                • interceder270@lemmy.world · 1 year ago

                  You only have thoughts because of electricity.

                  Do you believe in the existence of a soul or some other god-gene that separates us from machines?

                  • wildginger@lemmy.myserv.one · 1 year ago

                    No one said anything about electricity. A calculator can exist on paper, or in stones and sticks.

                    No one said anything about souls. Please don’t make up shit no one said.

                    I am not an equation. I do not take X input to produce Y output. My thoughts do not require outside stimuli. My thoughts do not give the same output for the same input. I can think, ambulate, and speak inside a dark room with no stimulus, based entirely on my own thoughts.

                    ChatGPT and other language models are equations. They trick you by using random number generation to simulate new outputs to repeated inputs, but if you open the code running the equation and fix the RNG to a set value, you get the same output for each input.

                    It’s not thought. It’s an equation.

                    I am not saying non-organic thought isn’t possible. I am saying that a salesman pointed at a very, very, very big calculator and said, “It definitely thinks! It’s more than an equation!” And you, along with a lot of news outlets, fell for it.

                    We do not have machine brains yet. Someone just tried to sell calculators as if they were.

            • WillFord27@lemmy.world · 1 year ago

              Oddly, I’d find a piece of music written by an AI convinced it was a chair extremely artistic, lol. But yeah, just because the algorithm that’s really good at putting words together is trying to convince you it has feelings doesn’t mean it does.
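
The fixed-RNG determinism mentioned upthread is easy to demonstrate. Below is a minimal sketch in Python: a toy next-token table stands in for a language model’s output distribution (the vocabulary and probabilities are invented for illustration, not taken from any real model), and sampling with a pinned seed reproduces the same “creative” output on every run.

```python
import random

# Toy next-token distributions standing in for a language model's
# output layer. Each entry is (token, probability).
NEXT = {
    "<s>": [("the", 0.6), ("a", 0.4)],
    "the": [("cat", 0.5), ("dog", 0.5)],
    "a":   [("cat", 0.5), ("dog", 0.5)],
    "cat": [("sat", 0.7), ("</s>", 0.3)],
    "dog": [("ran", 0.7), ("</s>", 0.3)],
    "sat": [("</s>", 1.0)],
    "ran": [("</s>", 1.0)],
}

def generate(seed):
    """Sample tokens until the end marker, with the RNG fixed to `seed`."""
    rng = random.Random(seed)  # pinning the seed removes all apparent novelty
    token, out = "<s>", []
    while token != "</s>":
        tokens, probs = zip(*NEXT[token])
        token = rng.choices(tokens, weights=probs, k=1)[0]
        if token != "</s>":
            out.append(token)
    return " ".join(out)

# Identical seeds give identical "generations".
assert generate(42) == generate(42)
```

Real systems add sampling temperature and vastly larger tables, but the mechanism is the same: fix every source of randomness and the output becomes a pure function of the input.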