Hi Perchance team!

I’ve been delving into the world of AI image generation, and have found that the Perchance generator is head and shoulders above pretty much everything else, and ridiculously fast. I’m very impressed, and was wondering if you might indulge me with a few questions?

To preface, I feel like you can’t be getting good value out of me as a user - I’m frequently generating a ton of images, and only seeing occasional ads. To that end, I’m trying to shift my heavier usage to my own local Stable Diffusion instance, but don’t seem to get the same quality of results.

Completely understand if you want to keep the secret recipe a secret - but if you’re willing to share - what checkpoints/LoRAs/etc. are you using? My best guess is SD1.5 + Reliberate, which gives me better results than the other checkpoints I’ve experimented with, but it’s definitely not the whole picture.
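For context, my local setup is roughly the sketch below, using the Hugging Face diffusers library. The "XpucT/Reliberate" repo id is my guess at where the Reliberate checkpoint lives on the Hub (an assumption on my part), and the prompt/sampler settings are just what I've been experimenting with - any SD1.5-derived checkpoint should load the same way:

```python
# Rough sketch of a local SD1.5 setup with diffusers.
# "XpucT/Reliberate" is a guess at the Hub repo id for Reliberate --
# swap in whichever checkpoint you actually use.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "XpucT/Reliberate",
    torch_dtype=torch.float16,  # halves VRAM use on CUDA
)
pipe = pipe.to("cuda")

image = pipe(
    "portrait photo of a lighthouse keeper, golden hour, detailed",
    negative_prompt="blurry, low quality, watermark",
    num_inference_steps=25,
    guidance_scale=7.0,
).images[0]
image.save("out.png")
```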

I also wanted to ask if you guys accept donations, or have a way for me to shout you coffee, beer, courvoisier, or whatever. :D

  • Grth@lemmy.world (OP) · 1 year ago

    Thanks for getting back to me so quickly, even if it’s taken me so long to respond. Whatever non-special things you’re doing seem to work really well! It works a lot faster than my local instance (suboptimal video card to blame there) and I get really good consistent base results, which I can then pull into my own instance to do more fun stuff with inpainting, upscaling and the like. So hopefully that ad revenue is making it worth your while. :D

    • perchance@lemmy.world (mod) · 1 year ago

      Yeah, it’s definitely worth investing in a fast graphics card if you’re getting deep into AI stuff, but they’re pricey. Inpainting and image-to-image should be possible on Perchance within the next month or so if all goes well. Ad revenue doesn’t cover all the server costs yet, so I pay for a portion of it out of my own pocket, but it’ll eventually be self-sustaining, and it’s not ‘breaking the bank’ for me. It’s much closer to self-sustaining than it was 12 months ago when I made the plugin - the research community has made SD inference a lot more efficient.