Genuine inquiry. Maybe I am not experienced enough with the various federated platforms, but I am an avid user of Matrix and have dabbled in Lemmy. From what I have seen, federation is on the path to decentralization but not fully there. It creates fiefdoms, little kingdoms. Great, yes, you may find one that suits you better, but users can end up isolated on their island; switch islands, sure, but now you are isolated from the previous island and maybe others. It's stupid. On Matrix you need to know the other island (server) to even find its rooms (communities). Some rooms block users from one server while others block users of other servers. You either have to run multiple accounts or accept the limits. Add in that you are at the mercy of your home server: you can lose your account, have it impersonated, and more. The performance is horrible too, not sure why, but content is slow to update and spread. Matrix has the problem that, because of its design, most people are on the matrix.org server, so the point of federation is largely lost. They are moving to p2p, where it seems the solutions built for federation no longer apply.

Anyway, why is federation not stupid? Are these problems only with Matrix? Because I look at Lemmy and it seems far worse.

  • lemm1ngsOP · 3 years ago

    Note that I don’t know about the details on how comments/mentions between instances A and C are perceived by instance B.

    So I am wondering that too: how does content that C interacts with on A affect B? I know what B would want, which is not to see it. To me, all these server-to-server rules are not ideal; the rules should instead be set by the users themselves, organizing on the platform. I know you have that with each community, but I am talking platform-wide too. The instance does that with its blocks, but that's very authoritarian and lacks nuance.

    Can you say more about how content moderation and codes of conduct work on a p2p network?

    I know how I would do it and how Matrix is planning to do it. Matrix is planning to keep it the same as rooms currently do it, which is the same as communities here. Server-based blocking becomes quite pointless in p2p, I think, which leaves me wondering how Matrix will handle spam, since their current main method is to block servers. In p2p you can potentially have new servers continually appear and attack you, so…

    I would use a moderation approach like the one discussions.app is trying, but that also really needs to use the way they organize content. The advantage is that the approach is grassroots and nuanced, with everyone getting the most individualized moderation you could expect short of some wonder AI doing it personally for everyone.

    • Liwott · 3 years ago

      I know you have that with each community, but I am talking platform-wide too.

      Maybe to simplify the discussion, let's talk about a platform that is not subdivided into communities, like Mastodon or Facebook. Communities are already some kind of federation.

      the rules should instead be set by the users themselves, organizing on the platform

      Ok, but nobody has the power to enforce the rules, right? How do you deal with trolling and spamming? Does every user have to block every troll one by one?

      I would use a moderation approach like the one discussions.app is trying

      which is?

      • lemm1ngsOP · 3 years ago

        Communities are already some kind of federation.

        This is an excellent interpretation. On federated platforms there are federations inside federations. It's superfluous really, but it gets worse when communities get duplicated across instances.

        Ok, but nobody has the power to enforce the rules, right? How do you deal with trolling and spamming? Does every user have to block every troll one by one?

        Well, it could depend on what you mean by enforce. There are moderators and they can enforce rules; it's just that each user is put in a kind of admin position to pick and choose the moderators. You can also have groups of moderators controlled by other users for any user to use, which is part of how discussions.app is doing their 'communities'; the other part is that each community chooses what #tagged content to use. The instance could also enforce certain moderators, and in fact some moderators must be set as defaults to keep the platform clean. The idea is these could be changed individually or with lists of moderators, or users can just live inside the curated communities. The other thing is that different instances could do it differently, with a different set of default moderators.
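
        To make that concrete, here is a rough sketch of how I picture the selection working; the names and shapes are mine, not discussions.app's actual API:

        ```ts
        // Hypothetical sketch only: none of these names come from discussions.app.
        type ModeratorId = string;

        interface InstanceConfig {
          defaultModerators: ModeratorId[]; // set by the instance to keep the platform clean
          lockedModerators: ModeratorId[];  // defaults the instance does not let users remove
        }

        interface UserModerationPrefs {
          removedDefaults: ModeratorId[];     // defaults this user has opted out of
          addedModerators: ModeratorId[];     // moderators this user chose personally
          communityModerators: ModeratorId[]; // inherited from the communities they use
        }

        // Resolve the set of moderators that actually applies to one user.
        function effectiveModerators(
          instance: InstanceConfig,
          prefs: UserModerationPrefs,
        ): Set<ModeratorId> {
          const mods = new Set<ModeratorId>(instance.defaultModerators);
          for (const id of prefs.removedDefaults) {
            if (!instance.lockedModerators.includes(id)) mods.delete(id); // locked defaults stay
          }
          for (const id of prefs.addedModerators) mods.add(id);
          for (const id of prefs.communityModerators) mods.add(id);
          return mods;
        }
        ```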

        The whole point of this type of platform/moderation is to solve problems seen on others. You won't be able to own a topic or community as such, since users must consent for you to have power. The reality is people are lazy, stupid, and will consent to crazy things, as seen over the last year. So because moderators provide a service stopping spam and abusive behavior, people will use them and they will have power. What this really prevents is moderators being bad actors, and also people lacking good moderation, or having no moderation at all when it's needed. Because anyone can moderate, there should be a much higher supply of moderation and of types of moderation. The types of moderation are where I think things get interesting, because there is a heap of behaviours that could be hidden and platforms would be much nicer places, but really it's up to the users what they want and how they experience things.

        • Liwott · 3 years ago

          There are moderators and they can enforce rules; it's just that each user is put in a kind of admin position to pick and choose the moderators.

          Wait, if there are globally set moderators, how is this not a centralised network? I mean, ok, it uses p2p technology so the data is not physically centralised on a single server. But the network itself, the graph of interactions, is a single blob where every node is connected to every other. Or am I missing something?

          • lemm1ngsOP · 3 years ago

            Moderators are globally set just on that one instance (which is really an interface or frontend), but they are also changeable by the users, depending on how the instance is set up. The way they are doing the data, it is all shared in blockchains, so in that sense it is a blob, but depending on each user and how they have that content curated, the user-to-user interactions would change. In theory you will have groups oblivious to other groups, but with users who cross over between them and become social bridges. The network is the people. My idea is that these human bridges will eventually lower the barriers between users blocking each other and change minds. The end result is a more connected, less divided social platform, and so too society, following the axiom that communication solves all problems. By putting the barriers to communication into the users' hands instead of third parties', they have the control to remove them.

            • Liwott · 3 years ago

              Ok, I'm not sure whether I understand how the mods are set.

              • if there are global mods who have the power to include in or exclude from the platform, this is a centralised platform
              • if each user chooses one or more mods from which they automatically derive white and/or black lists, this is a federated platform
              • if each user can only accept or block people for themselves, this is a decentralized platform (but you told me there are mods, so it cannot be that one)

              Do you agree with the above classification? If yes, which one is it?

              Maybe my question is equivalent to the following: what is the power of the mods?

              • lemm1ngsOP · edited · 3 years ago

                It's a difficult concept to get, partly because it has yet to really exist in any meaningful way, say on a platform with 50k or more users, and partly because of the terminology used. Moderation is a bad descriptor for it, except that it has much the same use and effect, with some differences. It would be better described as 'filters'. Filters would also be a good descriptor for what block and white lists are.

                I don't particularly care about classifications; to me it's just like different languages. To use your list, it's all three: the instance can implement central control, but users can switch instances and keep the same content, or the instance may allow its control to be altered, disabled or changed. Users can use lists made by others, which they may also help in making, and users may also have their own lists. The concept should expand to include multiple block and white lists working against each other with priorities, and the lists work not just against users but against any individual piece of content. I think it could and should be expanded to include editorial changes and annotations: things like pointing out problems with content rather than hiding it, grassroots fact-checking, etc. To me it's an incredibly adaptive and dynamic concept, which must be hard to implement, or surely someone would have tried long ago.
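
                To sketch what I mean by lists working against each other with priorities, plus annotations, something like this (all names invented; nothing here is discussions.app's actual design):

                ```ts
                // Hypothetical sketch: prioritized block/white lists plus annotations.
                type Verdict = "show" | "hide";

                interface FilterList {
                  priority: number; // on conflict, the higher-priority list wins
                  // null means this list has no opinion on the item
                  decide(authorId: string, contentId: string): Verdict | null;
                  annotate?(contentId: string): string | null; // optional note instead of hiding
                }

                interface FilterResult {
                  verdict: Verdict;
                  notes: string[]; // grassroots fact-checks, pointed-out problems, etc.
                }

                function applyFilters(
                  lists: FilterList[],
                  authorId: string,
                  contentId: string,
                ): FilterResult {
                  let verdict: Verdict = "show"; // unfiltered by default
                  let decidedAt = -Infinity;
                  const notes: string[] = [];
                  for (const list of lists) {
                    const opinion = list.decide(authorId, contentId);
                    if (opinion !== null && list.priority > decidedAt) {
                      verdict = opinion; // block and white lists work against each other
                      decidedAt = list.priority;
                    }
                    const note = list.annotate?.(contentId);
                    if (note) notes.push(note);
                  }
                  return { verdict, notes };
                }
                ```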

                what is the power of the mods?

                Currently on discussions.app, content and users can only be hidden. The instance or interface has some moderators as fixed defaults for doing this, but previously (and, I am told, again soon) logged-in users could unselect any of these moderators or add their own, and currently you can do your own moderation. Moderators can also be in control of a community, so using that community means using that moderation. The idea is that all moderation is done on a consensual basis with each user; the platform then becomes individualized for each user. Rather than looking for instances, you will just look for the communities and, increasingly in the future, the moderators. The simplest way would be to grow a follow list, similar to your moderator list, but hopefully those lists become tradable at some point.

                TLDR

                Sorry for the long text. In short, user-controlled, collaboratively maintained, and shareable filter lists are what I am talking about.
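
                To give an idea of the "shareable" part, here is a sketch of what such a list could look like as plain JSON; the fields are just my guess, not anything discussions.app actually implements:

                ```ts
                // Hypothetical shape for a shareable filter list; a real format would need
                // versioning, signatures, and so on.
                interface ShareableFilterList {
                  name: string;
                  maintainers: string[];    // collaborating curators
                  blockedAuthors: string[];
                  blockedContent: string[]; // individual posts, not only whole users
                  updatedAt: string;        // ISO 8601 timestamp
                }

                // Export so another user can import it and subscribe.
                function exportList(list: ShareableFilterList): string {
                  return JSON.stringify(list, null, 2);
                }

                function importList(json: string): ShareableFilterList {
                  return JSON.parse(json) as ShareableFilterList;
                }
                ```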

                • Liwott · 3 years ago

                  Thanks for your detailed response! There are essentially three angles from which I've been trying to understand your model. Before that, a quick remark.

                  I don't particularly care about classifications

                  Given the title of your post, I was assuming that there are platforms you classify as federated and that you think federation is a poor design choice. But I don't really understand what you mean by that, given that your rant covers quite a broad range of topics.

                  Storage

                  Here I think I understand correctly that it works as a p2p network where everyone seeds a bit of everything. Do users control what part of the network they seed? If yes, then the issue that you might lose your content if someone else (in this case, everyone else) suddenly doesn't want to share it anymore still exists.

                  If not, then isn't every user an accomplice in the diffusion of whatever illegal content circulates? I don't want to participate in sharing child pornography!

                  Distribution

                  Is the app in your model based on an open protocol that anyone can use to start their own network? Then what happens when people on two such networks try to interact with each other?

                  It has to be, otherwise it is clearly a centralised network that can be single-handedly shut down by its maintainer.

                  Network

                  Logically, the concept of a shared blacklist seems to me to be equivalent to federation. If you publicly subscribe to a mod's blacklist, it's as if you were joining their instance on the fediverse. If you don't, it's as if you were creating your own instance, but then you have to implement a blacklist yourself.

                  I understand the biggest difference is that it's easier to "start your instance", but that again implies everyone agrees to seed your content. I would not seed anyone's content if there is no code of conduct they have to obey. And that seems to logically yield users only seeding content approved by their chosen mods, which brings us back to federated storage, except that each instance's data is stored in p2p rather than on a centralized server. But the instance mod still has the same power as in the fediverse case.
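
                  To make the equivalence concrete, here is a toy comparison (names and data invented):

                  ```ts
                  // Toy comparison of the two models, with made-up data.
                  const allAuthors = ["alice", "bob", "carol", "dave"];

                  // Fediverse-style: the instance admin bans or defederates some authors.
                  const instanceBanList = new Set(["dave"]);
                  const visibleOnInstance = allAuthors.filter((a) => !instanceBanList.has(a));

                  // Shared-blacklist-style: a user subscribes to a mod whose list has the same entries.
                  const subscribedBlacklist = new Set(["dave"]);
                  const visibleToSubscriber = allAuthors.filter((a) => !subscribedBlacklist.has(a));

                  // Same entries, same visible set: the subscriber sees what the instance member sees.
                  console.log(visibleOnInstance, visibleToSubscriber); // [ 'alice', 'bob', 'carol' ] twice
                  ```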

                  • lemm1ngsOP · 3 years ago

                    So, on that classifications comment: it's me not caring what to call a concept, versus discussing something that is already understood as a concept. There is a concept of what federation is, and it is represented in existing federated platforms like Lemmy, Matrix and so on. In that concept I see stupid things, but it's not all the same, nor is all of the concept stupid.

                    So storage on discussions.app currently uses open blockchains, so it's the blockchain that is responsible for what is there. While they are only using that for text, blockchains have had child pornography put on them, and it is next to impossible to remove data from a blockchain, so that is an issue. I think the data would be better stored in something like IPFS. This type of platform relies on the backend for its distributedness, like with a blockchain: anyone who can access the backend can be part of the platform.
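
                    As a rough illustration of why I would prefer something content-addressed like IPFS, here is a toy sketch of the general idea (not IPFS's real API, just the concept):

                    ```ts
                    import { createHash } from "node:crypto";

                    // Toy content-addressed store: data is keyed by its own hash, like IPFS CIDs.
                    // A node that stops "pinning" a key simply stops serving it, whereas on a
                    // blockchain every full node keeps the data forever.
                    class ContentStore {
                      private pinned = new Map<string, string>();

                      put(content: string): string {
                        const key = createHash("sha256").update(content).digest("hex");
                        this.pinned.set(key, content);
                        return key; // the address is derived from the content itself
                      }

                      get(key: string): string | undefined {
                        return this.pinned.get(key);
                      }

                      unpin(key: string): void {
                        // the reference may still circulate, but this node no longer hosts the data
                        this.pinned.delete(key);
                      }
                    }
                    ```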

                    Yes, the blacklists would result in a federation of sorts. I think the idea that you store and distribute mainly the content you use is a good one. The data has to be somewhat unfiltered for the model to work, and to that extent you will potentially be distributing content you may not want to. Though the situation would be somewhat akin to someone putting encrypted content that people wouldn't like, say, here, except for the fact that they can't know that.