In recent weeks, I came across Adaptive Resonance Theory (ART), which attempts to solve problems similar to those addressed by mainstream backpropagation-based machine learning and deep learning. There’s also a 2017 paper on DeepART. The interesting part is the claim that ART overcomes many of the problems associated with backpropagation. In spite of this, in my last five years of following machine learning I’ve seen backpropagation everywhere and almost never seen Adaptive Resonance Theory, and Google Trends seems to agree with my experience.

So, are there inherent problems that Adaptive Resonance Theory does not solve, for which backpropagation-based machine learning has worked out pretty well? Or is it just another case in the tech industry of one approach winning out over another more or less by coincidence?
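
For anyone unfamiliar with ART, my rough understanding is that its core is a match-and-reset cycle controlled by a vigilance parameter, rather than gradient descent on a loss. Below is a minimal, illustrative ART-1-style sketch in Python (binary inputs, fast learning); the function and parameter names are my own, not from any particular library:

```python
# Minimal ART-1-style sketch: no loss gradient, just a match/reset cycle
# controlled by a vigilance parameter. Names and defaults are illustrative.
import numpy as np

def art1_fit(inputs, vigilance=0.75, alpha=0.001):
    """Cluster binary vectors; returns category prototypes and assignments."""
    weights = []          # each category j keeps a binary prototype w_j
    labels = []
    for x in inputs:
        x = np.asarray(x, dtype=float)
        # choice function: how well each existing prototype matches the input
        scores = [np.sum(np.minimum(x, w)) / (alpha + np.sum(w)) for w in weights]
        order = np.argsort(scores)[::-1]          # try best-matching category first
        for j in order:
            match = np.sum(np.minimum(x, weights[j])) / np.sum(x)
            if match >= vigilance:                # resonance: accept and learn
                weights[j] = np.minimum(x, weights[j])   # fast (one-shot) update
                labels.append(j)
                break
        else:                                     # no category passed the vigilance test
            weights.append(x.copy())              # commit a brand-new category
            labels.append(len(weights) - 1)
    return weights, labels

data = np.array([[1, 1, 0, 0], [1, 1, 1, 0], [0, 0, 1, 1]])
prototypes, assignments = art1_fit(data, vigilance=0.6)
print(assignments)  # e.g. [0, 0, 1] -- new categories appear as needed
```

The point of the sketch is just the contrast with backprop: nothing is overwritten when a new input doesn’t match, a fresh category is committed instead, which, as I understand it, is where the stability-plasticity claims come from.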

  • swiftessay@lemmygrad.ml · 1 year ago

    There are many proposed alternatives to backprop-based neural nets. But it’s difficult to pierce through the noise when everyone is already doing the one thing that is practical to do and there’s a wealth of tools and collective knowledge to draw upon.

    It takes more than showing results similar to deep learning’s for people to get interested in an alternative. It’s a shame, but it’s part of the socio-political aspect of research.