RSS is still the best way to track the news on the web, and these RSS readers can keep you right up to date.

  • AggressivelyPassive@feddit.de · 11 months ago

    Even as a link aggregator that would be perfectly fine for me personally.

    What really bugs me is that many news sites don’t keep their feeds clean, so you often get duplicates. More importantly, if you follow multiple sources, you get multiple copies of the same information, each packaged slightly differently, and often I’m not interested in even a single copy.

    For example, all news outlets had some Grammy/Taylor Swift crap in their feeds. Each outlet had like three different articles, all regurgitating the same information. I would love to have something like topic clusters, so that I could discard all the articles on a topic I’m not interested in at once.

    I even tried building it myself, but wasn’t very successful.
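
    The easy part, at least, is filtering exact duplicates within and across feeds. A minimal sketch, assuming feedparser and keying entries on their GUID (falling back to the link with query string and fragment stripped); it does nothing about the “same story, different outlet” case:

    ```python
    # Rough sketch only: drop exact duplicates by keying each entry on its GUID,
    # or on its link with query string and fragment stripped. The fallback rule
    # and field choices are illustrative assumptions.
    import feedparser
    from urllib.parse import urlsplit, urlunsplit

    def entry_key(entry):
        """Prefer the feed's own GUID; otherwise normalize the link."""
        if entry.get("id"):
            return entry["id"]
        parts = urlsplit(entry.get("link", ""))
        return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

    def deduplicated(feed_urls):
        """Yield entries from all feeds, skipping ones already seen."""
        seen = set()
        for url in feed_urls:
            for entry in feedparser.parse(url).entries:
                key = entry_key(entry)
                if key and key not in seen:
                    seen.add(key)
                    yield entry
    ```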

    • SpectralPineapple@beehaw.org · 11 months ago

      I don’t see how RSS could identify, prioritize, and remove duplicates between different sources in the same category. If I understand correctly, those are not really duplicates, but rather different articles on the same subject. Unless you are talking about a more complicated system or manual curation, I don’t think that is possible. I don’t believe I had much trouble with duplicates within the same feed; maybe I never subscribed to many feeds that do that.

      • AggressivelyPassive@feddit.de · 11 months ago

        It’s possible by analyzing the title and subtext (and the article snippet, if there is one). I tried having an AI model estimate the similarity of articles. It worked reasonably well, but I lack the motivation to build it out into a usable app.
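
        A minimal sketch of that kind of similarity clustering could look like the following, assuming sentence-transformers embeddings of title plus snippet and a greedy cosine-similarity cutoff (the model name and threshold are illustrative placeholders, not the setup actually used):

        ```python
        # Illustrative sketch, not the actual implementation described above:
        # embed each article's title + snippet and greedily merge articles whose
        # cosine similarity to a cluster's first member exceeds a threshold.
        from sentence_transformers import SentenceTransformer, util

        model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed small model

        def cluster_articles(articles, threshold=0.6):
            """articles: list of dicts with 'title' and optional 'summary'."""
            texts = [f"{a['title']} {a.get('summary', '')}" for a in articles]
            embeddings = model.encode(texts, normalize_embeddings=True)

            clusters = []  # each cluster is a list of article indices
            for i in range(len(articles)):
                for cluster in clusters:
                    if util.cos_sim(embeddings[i], embeddings[cluster[0]]).item() >= threshold:
                        cluster.append(i)
                        break
                else:
                    clusters.append([i])
            return [[articles[i] for i in c] for c in clusters]
        ```

        The threshold is the main knob: set it too low and unrelated stories get merged into one cluster; set it too high and near-duplicates slip through.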