I think Benedict Evans writes about a lot of really interesting stuff. Sometimes he gets right to the heart of things. Sometimes he’s wrong in important (and interesting!) ways.

This seems to me to be an example of the latter.

However, it often now seems that content moderation is a Sisyphean task, where we can certainly reduce the problem, but almost by definition cannot solve it. The internet is people: all of society is online now, and so all of society’s problems are expressed, amplified and channeled in new ways by the internet.

Fully agreed! Yes! Absolutely: technical problems are rarely just technical problems; they are also social problems.

We can try to control that, but perhaps a certain level of bad behaviour on the internet and on social might just be inevitable, and we have to decide what we want, just as we did for cars or telephones - we require seat belts and safety standards, and speed limits, but don’t demand that cars be unable to exceed the speed limit.

This, however, does not follow, and it doesn’t follow even for cars. It took a lot of corporate manipulation of people’s beliefs for us to start thinking about car crashes as “accidents”. It took intense lobbying to create the crime of “jaywalking” where before, people had been allowed to walk in the streets their taxes paid for, and people driving cars had been responsible for not hitting others.

Powerful entities had it in their interest to make you believe this was all inevitable. People made a lot of money from making us think that this is all just How Things Are, that we have to accept the costs and deaths. They’re still making a lot of money. Even those seat belt laws exist because the auto lobby wanted to get out of having to build in airbags.

Automotive technology is technology just like the Internet is technology. Where technology lets us leap over natural physical limitations, “human nature” isn’t an inherent fundamental to the situation. Why did we build the cars to go fast? Why do people assume they should be able to get around faster in a car than on a bike, even around pedestrians? If I write a letter that tells you to kill yourself and have a print shop blow it up into a poster, is the print shop at all responsible for their involvement in my words? What if they put out a self-service photocopier and choose not to look at what people are using it for? Is it different if it’s not a poster but a banner ad? A tweet? Sure, we can acknowledge that it’s some part of human nature that we’re going to be shitty to each other, but should we be helping each other do it at 70 miles per hour? The speed of light? These are uncomfortably political questions, questions that have power tied up in them.

And that’s exactly why I think it’s important to reject Evans’ thinking here.

Some people argue that the problem is ads, or algorithmic feeds (both of which ideas I disagree with pretty strongly - I wrote about newsfeeds here), but this gets at the same underlying point: instead of looking for bad stuff, perhaps we should change the paths that bad stuff can abuse. The wave of anonymous messaging apps that appeared a few years ago exemplify this - it turned out that bullying was such an inherent effect of the basic concept that they all had to shut down. Hogarth contrasted dystopian Gin Lane with utopian Beer Street - alcohol is good, so long as it’s the right kind.

Of course, if the underlying problem is human nature, then you can still only channel it.

He does not argue in the linked piece that algorithmic newsfeeds are worth their bad effects, only that they’re a response to a real problem – that’s why I liked the linked piece!

Let’s not make fuzzy comparisons, even with tongue in cheek; Dickens was quite right to note that the “great vice” of “gin-drinking in England” arose out of “poverty”, “wretchedness[,] and dirt”, which are no more human nature than all the riches of Silicon Valley… and as a non-teetotaler I am free to add without fear of being thought a nag that any quantity of alcohol is bad for your health. There aren’t inherent inducements to good or evil in beer or gin. The existing context is too important, and someone’s getting rich off of selling you either.

I’m not even sure I believe that we can know anonymous messaging inherently leads to bullying, only that the populations who seize upon it in our preexisting imperfect context are using it toward that end.

But if you’re willing to believe that YikYak had to die, why then believe that an engagement-maximization framework – algorithms harvesting your eyeballs – is not having significant impact on the way we interact with each other? Is this guy invested in Facebook? Did any philosopher, pessimist or optimist, imagine like count displays in their state of nature?

Ah, blech, the guy’s got a history in VC. I shouldn’t have opened the Twitter to try to confirm pronouns. There’s a very sad genetic fallacy (well, heuristic) we could apply here but I’m too busy to let myself be saddened by its conclusions.

  • @Manmoth
    1
    3 years ago

    deleted by creator

  • @roastpotatothief
    1
    3 years ago

    Lots of interesting ideas there. I can answer one of your questions though.

    is the print shop at all responsible

    That just depends on whether you legally define it as a service or a utility. Is its role to help you with individual jobs, or to provide a tool for you to use yourself? It’s like robbing a bank with a rental car versus a chauffeured car: chauffeur responsible, car lender not.

    • MayaOP
      2
      3 years ago

      A question meant to provoke contemplation of ethics rarely has one answer, and this certainly doesn’t. “Service” and “utility” are concepts used in the legal system, and are not cleanly inherent to the situations where they’re applied. Legal systems aim to provide clear delineation of liability, but they can’t be considered the be-all and end-all of morality. Rather, when we ask ourselves “has the print shop done something wrong here” we must also consider “should the government allow the print shop to do this? should I shop at a print shop that does this?” etc. etc. as related but separate questions. (“Responsibility” also has a lot of shades by most reckonings)

      • @roastpotatothief
        1
        3 years ago

        Your post certainly made me think about a few things.

        On that one point, though, I find that simple dichotomy completely sufficient.

        And both need to exist. You don’t want a world where people can’t get secret or personal things printed without the shop owner looking over his shoulder. You also need a service to exist where the staff will help you with difficult printing jobs.

        And it’s a duality that’s really universal too. Works well across diverse businesses and other areas of life.

        And my point from my other post: remember to think about the good that can come from a particular freedom, as well as the bad. This one is difficult, so I am always having to remind myself.

    • @roastpotatothief
      1
      3 years ago

      And another thing. Anonymous forums are a technology like any other. Like photocopiers, they can be used for good and bad… and they are, every day. If you ban them to rid the world of the bad, then you also lose the good. Everything is sometimes abused, and when it is there is a temptation to ban it. Each time, we also lose one degree of freedom in which we can do good. By removing a vice you make the world less virtuous and more vicious.

  • @poVoq
    1
    1 year ago

    deleted by creator

    • @Manmoth
      1
      3 years ago

      I think a working model already exists on 4chan of all places.

      There is a ton of garbage I do not want to see on 4chan but they’ve provided an excellent tool - filtering. If I use the filter I can see all of what I want and none of what I don’t and I can do it without any help. In the process of filtering I created the world that I want to participate in without diminishing that in which I don’t. This model makes the user responsible for BUILDING THEIR OWN echo chamber instead of having one given to them. A DIY worldview curation.

      Every site would be better if managed this way – even this one. Short of illegalities, the only ‘moderation’ that really matters is organizing discussions by topic, and even that can be automated and customized to a certain extent. People are trying to police each other too much. If I don’t like something I just open my filter, add the slur, term, topic, memephrase, whatever, and I’m done with it.

      • @roastpotatothief
        2
        3 years ago

        That’s it. That has to be the right answer.

        But what about doxing or blackmail? What if I publish the address of a civil rights protester, along with pipe bomb making tips?

        There must be some role for moderation too.

        • @Manmoth
          0
          3 years ago

          Illegalities aren’t really negotiable because they’re illegal. Platforms have to take reasonable measures to those ends. I think Minds.com handles this stuff with an actual “jury” of users.

          • @roastpotatothief
            1
            3 years ago

            I don’t see how the legality is important. Do you mean that it’s not the platform’s responsibility to deal with that stuff because the police should handle it instead?

            The police can’t do anything in this case, because the post was anonymous.

            • @Manmoth
              0
              3 years ago

              My point is that if the speech is somehow illegal, which (in the United States at least) is a definition so narrow that it almost doesn’t exist, then sure the platform should do something about it even if it’s just deleting the post.

              • @roastpotatothief
                1
                3 years ago

                Got it. So moderation does happen, but only in extreme cases.

                Normally, the user is expected to moderate his own filter bubble and encounter only the content he wants. So the platform has tools (lots of types of tags, I guess) to facilitate that.