• ichbinjasokreativ@lemmy.world · 7 months ago

    Tesla has a decent relationship with AMD though, right? That means Nvidia is nice-to-have for them, but not necessary.

    • Endmaker@lemmy.world · 7 months ago

      How are AMD GPUs useful though? Last I heard, CUDA (and cuDNN) is still an Nvidia-only thing.

      • ichbinjasokreativ@lemmy.world · 7 months ago

        There are compatibility layers that let CUDA code run on AMD, and everything AI can also run natively on ROCm. Using Nvidia is a choice, not a requirement.
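
        As an illustration, here's a minimal sketch assuming a ROCm build of PyTorch and a supported AMD GPU; the familiar torch.cuda code path works unchanged because ROCm builds map it onto HIP:

        ```python
        import torch

        # On a ROCm build of PyTorch, torch.version.hip is a version string and
        # the regular torch.cuda API is backed by HIP/ROCm instead of CUDA.
        print("HIP version:", torch.version.hip)

        if torch.cuda.is_available():
            device = torch.device("cuda")      # same device string as on Nvidia
            print("GPU:", torch.cuda.get_device_name(device))
            x = torch.randn(1024, 1024, device=device)
            y = x @ x                          # matmul executes on the AMD GPU
            print("Result shape:", tuple(y.shape))
        ```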

        • notfromhere · 7 months ago

          What is the best working compatibility layer for running CUDA on AMD? ROCm seems to drop support pretty quickly after release, though, so it's hard for it to get a foothold. As Karpathy has shown, doing low-level C++ has some amazing results…