I work on machine learning tasks daily, both as an ML researcher and as a hobbyist. The difference between what I can do at work and at home is significant: an A40 at work can do far more than the 3080 I have at home. That obviously makes sense, given the massively higher price point.

However, what I find odd is that there are no consumer-level server GPUs targeted at ML on the market. The A40 is not just a scaled-up consumer GPU, and with machine learning growing as a hobby, the absence of consumer and enthusiast-level server GPUs is a surprising market gap.

  • Osayidan@social.vmdk.ca
    1 year ago

    It’s a very new market demand that might not even be large enough yet. Until very recently nobody wanted a GPU for anything but gaming; then crypto, and later ML, came along.

    I personally feel that to get proper ML-capable hardware at home, we’ll need to see it mature in enterprise first. Right now it’s still GPUs; the name doesn’t even reflect what people use them for, so that alone speaks to the issue. They need to get AI off GPUs and onto cards that may look and function a lot like a GPU, but with none of the G. Once that happens it can hopefully trickle down like everything else does, and quickly. A lot of people are priced out of learning and experimenting in this field right now.