• mcherm@lemmy.world · 1 year ago

    I found that to be pretty insightful.

    I completely agree with the analysis that the ability to search is in tension with privacy and a guarantee that posts will be forgotten. Allowing individual posts to declare how they should be shared is a good idea.

  • 0x1C3B00DA (OP) · 2 years ago · edited

    I guess I fall into the author’s webhead category, and I still don’t understand the issue with search. To be clear, I’m talking about third-party search engines, not search built into AP software.

    All fediverse posts on all software (that I’ve used or know of) have an ActivityPub representation (the JSON blob) and an HTML representation (the page you see if you click the post’s URL). Search has been a part of the web for as long as I’ve been using it, and, as a commenter on the post said, decentralization is heavily dependent on indexing/search. Without it, you have a major discoverability problem, which is a consistent critique of the fediverse.
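    The dual representation works via ordinary HTTP content negotiation: the same post URL returns the ActivityPub JSON or the human-readable page depending on the request’s Accept header. A minimal sketch of that dispatch logic (an illustration of the mechanism, not any particular server’s implementation):

```python
# Sketch of ActivityPub-style content negotiation:
# one post URL, two representations, selected by the Accept header.

AP_TYPES = (
    "application/activity+json",
    "application/ld+json",
)

def pick_representation(accept_header: str) -> str:
    """Return which representation of a post to serve for this request."""
    for media_type in accept_header.split(","):
        # Drop any parameters (e.g. "; profile=...") before comparing.
        if media_type.split(";")[0].strip() in AP_TYPES:
            return "json"   # the machine-readable ActivityPub blob
    return "html"           # the page a browser user sees

print(pick_representation("application/activity+json"))        # json
print(pick_representation("text/html,application/xhtml+xml"))  # html
```

    A crawler asking for HTML and a federating server asking for ActivityPub JSON hit the exact same URL, which is why both representations are equally exposed to indexing.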

    I’ve seen people mention that users have an expectation of privacy, but I find that hard to believe. Posts from multiple social sites (Twitter, Instagram, Pinterest, Tumblr, Reddit, etc.) are indexed in most search engines; users are used to that and know that public posts are searchable. As a user, you have multiple tools to keep your posts from being indexed, and most fediverse software has decent moderation tools, so you can handle any incoming issues that do occur.
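    The standard web opt-outs are examples of those tools, applied to the HTML representation: a robots.txt rule, or a per-page noindex hint (Mastodon, for instance, exposes a per-account “opt out of search engine indexing” preference that emits the latter; details vary by software). A sketch, with a hypothetical username:

```
# robots.txt — ask well-behaved crawlers to skip a user's pages
User-agent: *
Disallow: /@someuser/

<!-- or, per page, in the HTML head: -->
<meta name="robots" content="noindex">
```

    Both mechanisms are advisory: compliant search engines honour them, but nothing stops a bad-faith crawler from ignoring them, which is the crux of the disagreement in this thread.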

    EDIT: A relevant comment from the HN post:

    It’s also unethical to make promises to users, like privacy for posts published to a semipublic social network, that software can’t possibly keep.

    • Ada@lemmy.blahaj.zone · 2 years ago

      I think you’re missing some important context here. What people are looking for isn’t “no search”. They’re looking for a balance between discoverability and public posts that allows them to exist and connect to new people without being trivially found by harassers.

      Remember, the fediverse was first largely populated by queer folk, queer folk who were generally escaping social media platforms with full text search that had become toxic for them. Yet they didn’t arrive with a full network of friends either. They needed some form of discoverability.

      And that’s why we see tag search as the limit. Yes, full-text search can exist and be done, but even making it one step removed from the app used to make comments shifts the balance in the right direction. When the same app that is used to harass is also used to discover, drive-by harassment is trivial and common. When searching for content involves some level of effort, and exists independently of the platform used to harass, drive-by harassment goes down. Of course, that doesn’t stop dedicated bigots with less casual harassment in mind, but that’s an acceptable balance for most folks.

      • ragica · 2 years ago

        Security through obscurity raises its head again… we’ve been down this fraught road so many times.

        If control over this privacy/discoverability balance is wanted, it had better be built in fast. Otherwise, once the network reaches a significant size, third parties that prove themselves significantly more useful than the built-in tools will take over. Search becomes a key feature of every network and communication/sharing platform there is.

        Unless of course the hope is that by limiting the utility of the network it remains small and therefore obscure and less used as a whole.

        The linked article is thoughtful and covers many of these points from multiple sides already though.

        • Ada@lemmy.blahaj.zone · 2 years ago

          Nah, it’s not security through obscurity. No one thinks their posts are secure. The goal is to not make them trivially available to people with intent to cause harm. Yeah, bad-faith indexing instances will always exist, but they don’t enable trivial drive-by harassment, and that’s the status quo people are trying to keep.

          • Ada@lemmy.blahaj.zone · 2 years ago · edited

            Again, though, that’s not what people are worried about, because it’s not where most harassment comes from. On Twitter, for example, people would have saved search terms and then, almost in real time, drop in and harass people talking about whatever topic they wanted to troll. That’s the behaviour people are trying to stop, and someone being able to hit up Google to track down a specific user’s or instance’s content isn’t enabling that behaviour.

  • Arthur Besse · 2 years ago · edited

    This is a nice overview of this absurd situation, but Tim Bray’s conclusions are a little surprising to me.

    Yes, Mastodon traffic either is already or soon will be captured and filed permanently as in forever in certain government offices with addresses near Washington DC and Beijing, and quite likely one or two sketchy Peter-Thiel-financed “data aggregation” companies. That’s extremely hard to prevent but isn’t really the problem: The problem would be a public search engine that Gamergaters and Kiwifarmers use to hunt down vulnerable targets.

    Here Bray appears to be missing the fact that those people will often end up with access to those Thiel-financed private intelligence services that will have the full-text search, while the rest of us won’t. Making things public and pretending they’re private by shunning search effectively lobotomizes everyone who abides by this custom, while still allowing the worst people to have the capability (and not only the ones working in state intelligence agencies).

    What success looks like: I’d like it if nobody were ever deterred from conversing with people they know for fear that people they don’t know will use their words to attack them. I’d like it to be legally difficult to put everyone’s everyday conversations to work in service to the advertising industry. I’d like to reduce the discomfort people in marginalized groups feel venturing forth into public conversation. (emphasis mine)

    This conflates almost entirely unrelated issues. The first half of the first sentence is about non-public conversations. The solution there is obviously to use e2e encryption, so that even the servers involved can’t see the content, and to build protocols and applications that don’t make it easy for users to accidentally make private things public. (ActivityPub was not designed for private communication; it was designed for publishing, so it is unlikely to ever be good at this.) The second sentence is about regulating the ad industry… ok, cool, an agreeable non-sequitur. But the last sentence is about public conversation, and in the context of the second half of the first sentence, it carries the strong implication that Bray entertains the fantasy that conversation can somehow be public and yet uninhibited by “fear that people they don’t know will use their words to attack them”.