Isn’t it possible to “find” the most valuable websites on the web with the help of a well-mixed community? I am thinking of a small browser add-on which could share the base URL of each visited website with a web scraper. The scraper could then index the whole website, including its subpages. The add-on could be installed independently by users who would like to strengthen the network.
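
Just to make the idea concrete, here is a rough sketch of what such an add-on’s background script might look like (WebExtension API; the scraper endpoint and payload shape are made up for illustration, and only the page’s origin is shared to keep the privacy exposure small):

```typescript
// Hypothetical WebExtension background script (needs the "webNavigation"
// permission plus host permission for the endpoint below).
const SCRAPER_ENDPOINT = "https://example.org/api/submit-url"; // made-up endpoint

// Fires whenever a top-level page finishes loading.
chrome.webNavigation.onCompleted.addListener(async (details) => {
  if (details.frameId !== 0) return; // ignore iframes

  // Share only the origin ("base URL"), not the full path, to limit what is exposed.
  const origin = new URL(details.url).origin;
  if (!origin.startsWith("http")) return; // skip chrome://, about:, etc.

  try {
    await fetch(SCRAPER_ENDPOINT, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ origin }),
    });
  } catch {
    // Submission is best-effort; failures are silently ignored.
  }
});
```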

Besides setting up one’s own search index, one could try to pull in search results from #Google and Bing as a ramp-up aid, similar to what #startpage and #duckduckgo do. I mean, I am no search engine expert, but is there really so much more magic to it?
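
To illustrate that ramp-up idea, this is roughly what the metasearch layer (the Startpage/DuckDuckGo-style part) boils down to: fan the query out to a few upstream engines and merge what comes back. The engine fetchers and result shape here are purely hypothetical; real engines require HTML scraping or paid APIs:

```typescript
// Minimal metasearch sketch: query several upstream engines in parallel,
// then deduplicate and re-rank the merged results.
interface Result { url: string; title: string; rank: number; }

// Hypothetical per-engine fetchers; each would parse that engine's real output.
type Engine = (query: string) => Promise<Result[]>;

async function metaSearch(query: string, engines: Engine[]): Promise<Result[]> {
  // Query all upstream engines in parallel; ignore engines that fail.
  const settled = await Promise.allSettled(engines.map((e) => e(query)));
  const all = settled
    .filter((s): s is PromiseFulfilledResult<Result[]> => s.status === "fulfilled")
    .flatMap((s) => s.value);

  // Deduplicate by URL, keeping the best (lowest) rank seen for each.
  const best = new Map<string, Result>();
  for (const r of all) {
    const seen = best.get(r.url);
    if (!seen || r.rank < seen.rank) best.set(r.url, r);
  }
  return [...best.values()].sort((a, b) => a.rank - b.rank);
}
```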

  • comfy · 2 years ago

    We do not have a usable federated search engine.

    Sepia Search [link], [wikipedia] might be interesting to you; it’s a search engine for PeerTube videos across many instances.

    As Searx is just a metasearch engine built on top of the well-known engines, it is not an alternative either.

    I’m glad you noticed; I’m annoyed by all the people who kept recommending it as an alternative to DDG when a recent complaint came out about DDG censoring more results!

    There might be some open-source search engines you could look into to answer some of these questions; I’m a bit too tired right now to research that. I know Sepia Search is open source, but maybe some more general-purpose ones are too.