I don’t know much about graphics cards, but the Framework laptop seems to offer an “AMD Radeon™ RX 7700S”, and Stable Diffusion on AMD hardware requires ROCm on Linux.

It’s not completely clear whether ROCm runs on the AMD Radeon™ RX 7700S, so I was wondering if anyone has experience setting it up on a Framework.
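A minimal sketch of how one might check whether ROCm actually sees the card, assuming ROCm and a ROCm build of PyTorch are installed. The ROCm version in the wheel URL and the `HSA_OVERRIDE_GFX_VERSION` value are assumptions (the override is a commonly reported workaround for RDNA3 laptop parts, not official guidance):

```shell
# 1. Does the ROCm runtime enumerate the GPU? (RX 7700S is RDNA3 / gfx11xx)
rocminfo | grep -i gfx

# 2. Install a ROCm build of PyTorch (the rocm6.0 path is an example;
#    pick the version matching your installed ROCm)
pip install torch --index-url https://download.pytorch.org/whl/rocm6.0

# 3. Some RDNA3 laptop chips reportedly need this override to be seen
#    as a supported target (assumption; try without it first)
export HSA_OVERRIDE_GFX_VERSION=11.0.0

# 4. Confirm PyTorch can use the device (ROCm reuses the "cuda" API name)
python -c "import torch; print(torch.cuda.is_available())"
```

If step 4 prints `True`, Stable Diffusion front-ends that use PyTorch should be able to pick up the GPU.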

  • velox_vulnus · 4 months ago

    I would like to retract my first sentence: it was misleading. The model can run without any hiccups with ROCm, and the 7700S is probably a decent GPU for this job. However, I will provide more context on the second sentence.

    I tried training and running models (GANs, LLMs) on my Lenovo IdeaPad S540 (i5-8265U, 8 GB RAM, 2 GB MX250). The result of that heavy computation: the battery is toast and holds no charge, the barrel-port adapter is severely damaged where it connects to the board, the metal panels have stress marks from uneven heating, and the soldered RAM is a goner. Remember those old TVs you had to hit to make them work? That’s the state of my device right now.

    A laptop’s cooling system isn’t built for sustained loads like this, so it would be better to run the model on a desktop or cloud GPU. By the way, older Stable Diffusion models reportedly work on something as old as an RX 580 with ROCm.

    • s12@sopuli.xyzOP · 4 months ago

      I see. Thank you.

      Would keeping it plugged in or removing the battery help with the battery issue?

      Edit: Also, is there any way to force the GPU to throttle earlier to reduce damage?

      • velox_vulnus · 4 months ago

        I think that would still be a bad idea. Running billions of parameters on a laptop isn’t just hard on the battery; you also have to consider how much heat the heat-sink can actually dissipate. That said, there’s nothing wrong with running the model a few times. A desktop GPU would handle a load that heavy easily.
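        On the earlier question about forcing the GPU to throttle sooner: if the driver exposes it, `rocm-smi` can lower the board power cap so the card backs off before it gets hot. A sketch, assuming the GPU shows up as device 0 and power capping is permitted; the 60 W figure is purely illustrative, not a tuned recommendation:

        ```shell
        # Show current power draw and temperature
        rocm-smi --showpower --showtemp

        # Lower the power cap (in watts) on GPU 0; needs root
        sudo rocm-smi --device 0 --setpoweroverdrive 60

        # Restore the default cap when done
        sudo rocm-smi --device 0 --resetpoweroverdrive
        ```

        Whether a given laptop firmware honors the cap varies, so check temperatures under load either way.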

        Might I suggest you try KOALA once? It has far fewer parameters than a typical large-scale diffusion model, so it will be much more forgiving on your device. Best of luck with your attempt.
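        To put “fewer parameters” in perspective, here’s a back-of-envelope estimate of the VRAM needed just to hold the weights. The parameter counts are illustrative assumptions, and activations, the VAE, and the text encoder all add more on top:

        ```python
        def weight_vram_gib(n_params: float, bytes_per_param: int = 2) -> float:
            """Approximate GiB needed to hold the weights alone
            (fp16 = 2 bytes per parameter)."""
            return n_params * bytes_per_param / 2**30

        # e.g. a ~2.6B-parameter U-Net vs. a ~0.7B distilled one
        # (both counts are assumptions for illustration)
        print(f"{weight_vram_gib(2.6e9):.1f} GiB vs {weight_vram_gib(0.7e9):.1f} GiB")
        ```

        On a 2 GB card like the MX250 above, even the weights of the larger model don’t fit, which is why a distilled model is the only realistic option there.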