• Contend6248@feddit.de · 1 year ago

    Cool project!

    Would love AMD Linux support though. Stable Diffusion is fairly easy to install, but I just don’t like the Stable Diffusion WebUI

    A Flatpak would be golden

    • Arksarius@feddit.de · 1 year ago

      Just install ComfyUI locally using their AMD Linux install guide, then install the models and extensions that KritaAI needs, as mentioned on their GitHub. This way you can easily use the full Comfy, which is what KritaAI installs as the backend, and just select the local server in the KritaAI options to use your local install. Works perfectly well on my Arch system with a 6800XT.
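
      Before pointing KritaAI at the local install, it can help to confirm that the ROCm build of PyTorch (which ComfyUI runs on) actually sees the card. A minimal sketch, assuming you installed the ROCm wheels the ComfyUI AMD guide points to:

        # rocm_check.py - quick sanity check that the ROCm PyTorch build sees the card
        import torch

        # On ROCm builds torch.version.hip is a version string; on CUDA builds it is None.
        print("HIP runtime:", torch.version.hip)

        # ROCm is exposed through the torch.cuda API, so this should report True and the GPU name.
        if torch.cuda.is_available():
            print("GPU:", torch.cuda.get_device_properties(0).name)
        else:
            print("No GPU visible - re-check the ROCm install from the ComfyUI guide.")

      If this prints None or no GPU, fix the PyTorch install first; the rest of the setup won't use the card either.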

  • Otter@lemmy.ca · 1 year ago

    Giving it a try now! Haven’t tried Krita before either.

    Edit: the server seems to error out when I try to run it, so I’m testing the CPU version instead. Just need to wait for it to finish.

  • Kaldo@kbin.social · 1 year ago

    Am I understanding correctly that this is truly FOSS and fully offline, with no remote server or model we have to connect to? What was the model trained on? I’m really curious, but I also don’t want to support proprietary, unethical data sourcing.

    • ZerushOP · 1 year ago

      It creates a local server during setup (~10 GB of disk space), so as said, it’s self-hosted.
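
      If you want to confirm that nothing leaves your machine, you can ask that local server directly what it sees. A minimal sketch, assuming the default ComfyUI address 127.0.0.1:8188 (adjust it if you changed the server settings in the plugin):

        # local_server_check.py - query the self-hosted backend; no remote connection involved
        import json
        import urllib.request

        url = "http://127.0.0.1:8188/system_stats"  # ComfyUI status endpoint (assumed default port)
        with urllib.request.urlopen(url, timeout=5) as resp:
            stats = json.load(resp)

        # Lists the devices the local server will render on (CPU and/or GPU).
        for dev in stats.get("devices", []):
            print(dev.get("name"), "-", dev.get("type"))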

      • Otter@lemmy.ca · 1 year ago

        What would be the minimum requirements outside of the disk space?

        I assume I’m hitting a limit, but I’m not sure what kind of upgrade I’d need to use it effectively

        • ZerushOP · 1 year ago

          I remember reading something about it on GitHub. I think it should work on a current PC or laptop; the question is what resolution you can render at and how long it takes.
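
          The practical ceiling is usually GPU memory rather than the CPU, so checking free VRAM gives a rough idea of what resolution is realistic. A minimal sketch, assuming the backend’s PyTorch install is available:

            # vram_check.py - free GPU memory is what mostly limits render resolution
            import torch

            if torch.cuda.is_available():
                free, total = torch.cuda.mem_get_info()  # bytes free / total on device 0
                print(f"VRAM: {free / 1024**3:.1f} GiB free of {total / 1024**3:.1f} GiB")
            else:
                # CPU-only still works, just expect much longer render times per image.
                print("No GPU found - generation will run on the CPU.")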

  • Helix 🧬@feddit.de · 1 year ago

    This is very cool; sadly, AMD GPUs on Linux are not supported. I hope the server component can be run independently there at some point.

    • Wes_Dev · 1 year ago

      I’ve got an RX 580 8GB. What sucks is that it USED TO SUPPORT running AI and stuff. But AMD removed that support in new versions of the driver. Might have had a good reason, might not have. Still, sucks.