• bioemerl@kbin.social · 1 year ago

    As it fucking should be; this is like the third time they've tried to redesign the chip to get around export restrictions.

    • baconisaveg@lemmy.ca · 1 year ago

      And yet all of the good open-source LLM work is coming out of China. But hey, maybe this will spur some ROCm development so AMD cards don't suck so fucking hard in that space.

        • baconisaveg@lemmy.ca · 1 year ago

          I liked your original comment before the edit; it's true. AMD only has consumer-grade hardware, which is the main reason the majority of AI projects don't support it. Code written for CUDA works on an A100 just as well as on a 3090; there's no money in developing software to run on someone's four-year-old gaming PC.
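
          A minimal sketch of that portability point (the file name and build flags here are assumptions, not from the thread): the same CUDA source builds unchanged for an A100 (sm_80) and a 3090 (sm_86); only the -gencode targets differ.

          // add.cu: build one binary for both cards with, e.g.:
          //   nvcc -gencode arch=compute_80,code=sm_80 -gencode arch=compute_86,code=sm_86 add.cu
          #include <cstdio>
          #include <cuda_runtime.h>

          // Simple element-wise vector add; nothing here is A100- or 3090-specific.
          __global__ void add(const float* a, const float* b, float* c, int n) {
              int i = blockIdx.x * blockDim.x + threadIdx.x;
              if (i < n) c[i] = a[i] + b[i];
          }

          int main() {
              const int n = 1 << 20;
              float *a, *b, *c;
              // Unified memory keeps the example short; it works on either GPU class.
              cudaMallocManaged(&a, n * sizeof(float));
              cudaMallocManaged(&b, n * sizeof(float));
              cudaMallocManaged(&c, n * sizeof(float));
              for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

              add<<<(n + 255) / 256, 256>>>(a, b, c, n);
              cudaDeviceSynchronize();

              printf("c[0] = %f\n", c[0]);  // expect 3.000000
              cudaFree(a); cudaFree(b); cudaFree(c);
              return 0;
          }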

      • fruitycoder@sh.itjust.works · 1 year ago

        Right? It's really unfortunate that so many of the Chinese people who make awesome stuff get limited, because the CCP has garnered so much distrust that nobody believes they won't use AI to advance their surveillance and authoritarian control systems, both domestically and abroad.

        I mean, it was also a shame how much the NSA's actions weakened trust in the West as well.