Isn’t it possible to “find” the most valuable websites on the web with the help of a well-mixed community? I am thinking of a small browser add-on which shares the base URL of visited websites with a website scraper. The scraper can then index the whole website including its subpages. The add-on could be installed independently by users who would like to strengthen the network.
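The scraper side of this idea mostly comes down to following same-site links from a submitted base URL. As a minimal sketch (stdlib only; the function names are my own, not from any existing project), extracting the subpage links a crawler would follow could look like this:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collects href targets from <a> tags in an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def same_site_links(base_url, html):
    """Resolve links against base_url and keep only those on the same host,
    i.e. the subpages a site crawler would enqueue next."""
    parser = LinkCollector()
    parser.feed(html)
    host = urlparse(base_url).netloc
    seen, result = set(), []
    for href in parser.links:
        absolute = urljoin(base_url, href)  # handles relative hrefs
        if urlparse(absolute).netloc == host and absolute not in seen:
            seen.add(absolute)
            result.append(absolute)
    return result
```

A real crawler would of course also need fetching, robots.txt handling, politeness delays, and deduplication by content, but the link-frontier logic above is the core loop.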

Besides setting up one’s own search index, one could try to pull in search results from #Google and Bing as a ramp-up, similar to what #startpage and #duckduckgo do. I mean, I am no search engine expert, but is there really so much more magic?
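On the “is there so much more magic?” question: the core of an own search index is an inverted index mapping terms to pages. A toy sketch (names and the AND-only query semantics are my simplifications; production engines add stemming, ranking such as BM25 or PageRank, and sharding) might look like:

```python
import re
from collections import defaultdict

def tokenize(text):
    """Lowercase word tokens; real engines also stem and drop stop words."""
    return re.findall(r"[a-z0-9]+", text.lower())

def build_index(docs):
    """Build an inverted index: term -> set of page URLs containing it.
    docs maps each URL to its extracted page text."""
    index = defaultdict(set)
    for url, text in docs.items():
        for term in tokenize(text):
            index[term].add(url)
    return index

def search(index, query):
    """Return pages containing every query term (simple AND semantics)."""
    terms = tokenize(query)
    if not terms:
        return set()
    results = index.get(terms[0], set()).copy()
    for term in terms[1:]:
        results &= index.get(term, set())
    return results
```

The hard parts in practice are less the index itself than crawl coverage, freshness, spam resistance, and result ranking, which is where the big engines invest most of their effort.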

  • @matlOP
2 years ago

Yeah, you are right on many points. I do not have any special plans; I am just annoyed that Google and Bing, and now also DDG, always track you across different services on the web. While we have open and distributed networks such as Mastodon, Pixelfed and many more, we do not have a usable federated search engine. I know YaCy from a long time ago, but I think it is still not competitive. As Searx is just a meta search engine built on top of the known engines, it is also not an alternative.

    • comfy
2 years ago

we do not have a usable federated search engine

Sepia Search [link],[wikipedia] might be interesting to you; it searches PeerTube videos across many instances.

      As Searx is just a meta search engine built on top of the known engines, it is also not an alternative.

I’m glad you noticed. I’m annoyed by all the people who kept recommending it as an alternative to DDG when a recent complaint came out about DDG censoring more results!

There might be some open-source search engines you could look into to answer some of these questions; I’m a bit too tired right now to research that. I know Sepia Search is open source, but maybe some more general-purpose ones are too.