I’ve been made aware that the code I’ve written to access text-to-speech via the ElevenLabs API is no longer working.

I’ve tested it, and it seems that the CORS proxy currently used in ai-character-chat doesn’t allow POST requests (which are needed to ‘post’ the text to be ‘spoken’ to ElevenLabs).

Not a major/priority issue, but it would be nice to fix. I also wonder how many people are using text-to-speech (even just the Speech Synthesis code) in ai-character-chat.
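For context, here’s a minimal sketch of the kind of call that fails. `PROXY_URL` and `VOICE_ID` are placeholders (not the actual values used in ai-character-chat); the endpoint and `xi-api-key` header follow ElevenLabs’ public API docs as I understand them:

```javascript
// Minimal sketch of the failing call: a POST through a CORS proxy to the
// ElevenLabs text-to-speech endpoint. PROXY_URL and VOICE_ID are
// placeholders, not the real values used in ai-character-chat.
const PROXY_URL = "https://example-cors-proxy.test/"; // placeholder proxy
const VOICE_ID = "some-voice-id";                     // placeholder voice

async function speak(text, apiKey) {
  const response = await fetch(
    PROXY_URL + "https://api.elevenlabs.io/v1/text-to-speech/" + VOICE_ID,
    {
      method: "POST", // the method the proxy reportedly rejects
      headers: {
        "Content-Type": "application/json",
        "xi-api-key": apiKey, // ElevenLabs API key header
      },
      body: JSON.stringify({ text }),
    }
  );
  if (!response.ok) throw new Error("TTS request failed: " + response.status);
  return await response.blob(); // audio data
}
```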

  • perchance@lemmy.worldM · 7 months ago

    Hmm, I wasn’t able to replicate this problem. I added this as a character’s custom code, and it does log the response in the console:

    oc.thread.on("MessageAdded", async function() {
      // POST some JSON to httpbin's echo endpoint and log the response:
      let result = await fetch("https://httpbin.org/post", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({foo: 3})
      }).then(r => r.json());
      console.log(result);
    });
    

    Is it perhaps header values you’re missing, or something? It’s certainly possible that there is a bug and I’m just not hitting it with the above example for whatever reason.

    Either way, this motivated me to finally get around to creating a CORS-bypassing plugin: https://perchance.org/fetch-plugin That way I can ensure strong backwards compatibility and performance instead of just relying on a little glitch.com server, which won’t scale and doesn’t have good perf/uptime guarantees. I’ll wait to hear back from you before integrating it into ai-character-chat, just so we don’t end up making it harder for you to reproduce the bug you were up against here.

      • perchance@lemmy.worldM · 7 months ago

        @Alllo@lemmy.world @VioneT@lemmy.world Sorry, just realized the plugin is not quite ready for prime time! I need to make some breaking changes, so I’ve commented out the code for now. The moment I started integrating it into ai-character-chat I realised it’s a bad idea to overwrite the existing fetch; it should instead just be a separate “superFetch” or whatever, because otherwise if you need to e.g. download a big file that you know doesn’t have any CORS problems, you’re forced to go through the proxy, which will be slower. Will update soon hopefully, if not tomorrow. Sorry for the trouble if you’d started playing with it!
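        The “separate superFetch” idea could be sketched roughly like this (a hypothetical illustration, not the plugin’s actual code; `CORS_PROXY` is a made-up placeholder):

```javascript
// Hypothetical sketch: keep the normal fetch untouched, and only route a
// request through a CORS proxy when the direct request fails, so
// CORS-friendly downloads (e.g. big files) keep full speed.
const CORS_PROXY = "https://example-cors-proxy.test/?url="; // placeholder

async function superFetch(url, options) {
  try {
    // Fast path: direct request, no proxy overhead.
    return await fetch(url, options);
  } catch (err) {
    // fetch rejects (TypeError) on network/CORS failures; only then
    // retry through the proxy, accepting the slower round trip.
    return await fetch(CORS_PROXY + encodeURIComponent(url), options);
  }
}
```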

      • perchance@lemmy.worldM · 7 months ago

        Okay, thanks for the example, I think it’s all fixed now. Hopefully I didn’t break anything. I’ve been up for two days straight though, so I wouldn’t bet on it, but I did some basic tests and it seems good. Will check Lemmy messages first thing tomorrow 🫡

        • allo@lemmy.world · 7 months ago

          haha. one and a half days for me and maybe 30+ hours straight just now on something with your wonderful comments update :) can’t wait to share it! No sleeping yet!

          • BluePower@sh.itjust.works · 7 months ago

            Same 😄

            I’ll complete and share the project possibly (and hopefully) as an actual plugin after the post-announcement update of my generator hub page, but I’ll be releasing the “early implementations” somewhere in my experiment generator so everyone can try it right now and give some feedback on it.

        • VioneT@lemmy.worldOPM · 7 months ago

          It seems to work now, though there are some inconsistencies in the chunk text arrangement, which cause the text in the stream to be quite jumbled. I’m looking into it now; I’ll update if it is still inconsistent with the order.

          EDIT: It is inconsistent with the order of the chunks. Maybe there is a way to parse them in order? Currently I push each chunk into an array, sort that array by index, then join the sorted array into a string before passing it to the text-to-speech. It is still inconsistent, though, and sometimes the streaming finishes before the text to be spoken has been queued up.

          Here are the code hacks to re-sort the chunks into order lmao.
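          The re-sorting hack can be sketched like this (the `{index, text}` chunk shape matches the other examples in this thread; `sortAndJoin` is just the re-ordering step, not the actual text-to-speech code):

```javascript
// Collect streamed chunks, sort by index, then join before speaking.
function sortAndJoin(chunks) {
  return chunks
    .slice()                            // don't mutate the caller's array
    .sort((a, b) => a.index - b.index)  // restore stream order
    .map(chunk => chunk.text)
    .join("");
}

// Example: chunks that arrived out of order
const received = [
  { index: 2, text: "world" },
  { index: 0, text: "Hello" },
  { index: 1, text: ", " },
];
console.log(sortAndJoin(received)); // "Hello, world"
```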

          • perchance@lemmy.worldM · 6 months ago

            I wasn’t able to reproduce this when trying it with this code:

            oc.thread.on("StreamingMessage", async function (data) {
              let lastChunkI = -1;
              let chunksText = "";
              let chunks = [];
              for await (let chunk of data.chunks) {
                chunks.push(chunk);
                chunksText += chunk.text;
                // Warn if a chunk's index isn't consecutive with the last one:
                if(chunk.index !== lastChunkI+1) console.warn("OUT OF ORDER CHUNKS!", chunks);
                lastChunkI++;
              }
              console.log("chunks:", chunks);
              console.log("chunksText:", chunksText);
            });
            

            Or have I misunderstood the problem?

            • VioneT@lemmy.worldOPM · 6 months ago

              You have. On trying your code, it gave me the OUT OF ORDER CHUNKS! warning:



              Here’s the character I used with the custom code to check the out-of-order chunks: Link to Character

              • perchance@lemmy.worldM · 6 months ago

                Ahh, thank you! I was very confused at first: I iteratively made your character closer to the default Assistant to work out why it was happening in yours but not the Assistant, and found that the profile pic was the cause lmao. Eventually I realised it was because of the data URL vs. a normal URL: the data URL (being larger) was making an async IndexedDB request take a few milliseconds longer, which caused the out-of-order-ness. But I shouldn’t have even been doing those DB requests in the first place, so I’ve removed them, and this race-condition-type bug shouldn’t be possible at all now. Thanks again!!
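                The race can be sketched like this (timings simulated with setTimeout; `emitAfterLookup` is a hypothetical stand-in for the per-chunk IndexedDB read, not the actual ai-character-chat code):

```javascript
// If each chunk awaits its own async lookup before being emitted, a slow
// lookup (e.g. reading a large data-URL avatar) lets later chunks
// overtake earlier ones.
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

async function emitAfterLookup(chunk, lookupMs, out) {
  await delay(lookupMs); // stand-in for the per-chunk async DB request
  out.push(chunk.index);
}

async function demo() {
  const racy = [];
  await Promise.all([
    emitAfterLookup({ index: 0 }, 20, racy), // slow lookup (big data URL)
    emitAfterLookup({ index: 1 }, 1, racy),  // fast lookup (plain URL)
  ]);
  // racy is now [1, 0]: chunk 1 overtook chunk 0.

  // The fix mirrors the change described above: drop the per-chunk async
  // work and emit chunks synchronously, so arrival order is preserved.
  const ordered = [];
  for (const chunk of [{ index: 0 }, { index: 1 }]) {
    ordered.push(chunk.index); // no await, order preserved
  }
  return { racy, ordered };
}

demo().then(({ racy, ordered }) => console.log(racy, ordered)); // [ 1, 0 ] [ 0, 1 ]
```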