So I’ve been trying to install the proprietary Nvidia drivers on my homelab so I can get my fine-ass art generated using Automatic1111 & Stable Diffusion. I installed the Nvidia 510 server drivers and everything seemed fine, but after a reboot, nothing. WTF Nvidia, why you gotta break X? Why is X even involved in a server driver? What’s your problem, Nvidia!
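
If all you actually need is compute for Automatic1111, one way to sanity-check the driver after a reboot without touching X is to query it over NVML. A rough sketch, assuming the nvidia-ml-py Python bindings are installed (that package choice is mine, not anything Nvidia requires):

```python
# Headless check that the Nvidia kernel driver is loaded and answering,
# using NVML via the nvidia-ml-py package (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
print("Driver version:", pynvml.nvmlSystemGetDriverVersion())

for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    name = pynvml.nvmlDeviceGetName(handle)
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"GPU {i}: {name}, {mem.total // (1024 ** 2)} MiB total")

pynvml.nvmlShutdown()
```

If that prints a driver version and your card, the compute stack is alive even if X never comes back.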

  • Crayphish@sh.itjust.works
    1 year ago

    For what it’s worth, NVIDIA’s failings on Linux tend to be mostly in the desktop experience. Used as a compute device driven by CUDA and not responsible for the display buffer, the cards work plenty well. Enterprise users won’t be running a GUI or desktop environment on the machines that do the AI work, if at all.
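
    To illustrate the point, a headless box only needs the CUDA side of the stack to answer. A minimal sketch, assuming a CUDA-enabled build of PyTorch is installed (the same stack Automatic1111 sits on); the sizes and the matrix multiply are arbitrary:

    ```python
    # Exercise the GPU purely as a CUDA compute device, no display server involved.
    import torch

    if not torch.cuda.is_available():
        raise SystemExit("CUDA not available - driver or toolkit problem")

    device = torch.device("cuda:0")
    print("Using:", torch.cuda.get_device_name(device))

    # A small matrix multiply proves kernels actually launch and complete.
    a = torch.randn(1024, 1024, device=device)
    b = torch.randn(1024, 1024, device=device)
    c = a @ b
    torch.cuda.synchronize()
    print("Compute OK, result norm:", c.norm().item())
    ```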

    • Aasikki
      1 year ago

      Even the old 1060 in my TrueNAS SCALE server has worked absolutely flawlessly with my Jellyfin server.

    • Diplomjodler@feddit.de
      1 year ago

      They don’t give a fuck about consumers these days, and with Linux being just a tiny fraction of the userbase, they give even less of a fuck.

    • Klara@lemmy.blahaj.zone
      1 year ago

      I’d had a bunch of issues with my GTX 1080 before I switched to an AMD RX 5700 XT, which I love, but I recently put the 1080 back in use as a headless game streaming server for my brother. It’s been working really well, handling both rendering and encoding at 1080p without issue, so I guess I’ve arrived at the same conclusion: they don’t really care about desktop usage, but once you’re not directly interacting with a display server on an Nvidia GPU, it’s fine.