First, a disclaimer: I work somewhere that is relevant to this topic[1] so I want to be extra clear that I am only communicating my personal views.

Les seems to be (maybe! he can reply if I’m wrongly interpreting) thinking about the sorts of responsibilities We The Public have assigned to entities like social media companies without them really, uh, rising well to meet the challenge. I have been thinking a lot recently about Parler and particularly about how misunderstandings about “digital space” imply very problematic things because they’re not tied to how the actual internet works.

So when I’ve been thinking about this kind of thing recently, I’ve been having very similar ideas to Les on this part:

It goes something like this - freedom of speech does not imply a right to amplification.

The former is your unfettered ability to speak using your own capacity. The latter is others relaying, repeating, augmenting your speech.

I believe the former is an individual right - balanced by the right of others’ expression.

The latter is not a right - because it would essentially demand others be enslaved in service to your speech.

The comparisons are clear. You’ve always had a right to go shout on a sidewalk. So when, say, Twilio (to pick a company not carefully at all) drops Parler, that’s fine, because you’ve never had a right to force a publisher to carry your screed on Algerian mind control tomatoes.

And yet.

And yet.

Put differently, I don’t think you get to be preternaturally loud without the help & consent of others. And I think maybe there should be accountability for providing that help & consent.

I think this runs into conflict with notions of common carriage and safe harbor. But I’m not sure these are unalloyed goods. We’re building huge, largely unsupervised event spaces that have become chaotic attractive nuisances. They’re like empty swimming pools in vacant rental properties - but with scant accountability for the landlord when a kid falls in and cracks their skull.

I think this is a fair analogy, but not necessarily a complete analogy. I’ve written out and deleted about five different ideas about why at this point, so I’m going to just give you one for now and it may not be that well-worded.

As easy as it is to say that private internet companies are enacting private choices just like an absentee landlord on their own land, there is an aspect here where this doesn’t quite match.

You know, there is an old concept of the public data network. I’m told the term mostly died once we got to the internet, but I can’t help thinking it’s still a meaningful one. The internet was publicly funded, of course, at various points in its development. More than with other equivalent research, there’s something public about it that we have to acknowledge. The internet is better for being an everyone network. It doesn’t have to be an unalloyed good for some aspect of the good to be tied to its access being public, an access the public benefits from.

We therefore have some real interest in making sure that all children are at least free to traipse about on unfenced properties, in a sense, which doesn’t quite match the metaphor.

I want there to be some people who do have responsibilities to provide networked computer services with equal availability for all. That work is nobler for its being equally accessed, even if that does mean some awful people benefit from it. Awful people benefit from water treatment facilities too, or phone lines to let them call their awful loved ones. I’m at peace with that. I want a gay kid in a podunk town to get the same big gay internet the rest of us make great even if their local authorities aren’t keen on the idea.

At the same time, we’re going about it in exactly the wrong way when we can see columnists at the national level bemoaning that the U.S. President has been silenced because his Twitter account was suspended. If he wants to hire his own people to hook up his own computers to the internet, he has enough money to do it, and enough people to hire from.

(…well, Parler was apparently one giant Wordpress install, so maybe the tech community, they’re not sending their best… but you don’t need startup energy or BigCo talent to serve out a text file of whatever he would have been tweeting, which answers the important freedom of speech question here.)
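To underline how low that bar really is, here is a minimal sketch of serving a single text file over HTTP using nothing but Python’s standard library. The filename `statements.txt` and port are hypothetical stand-ins; this is an illustration of the point, not a hardened deployment.

```python
# Minimal sketch: serve one plain-text file over HTTP with only the
# Python standard library. "statements.txt" is a hypothetical filename.
from http.server import BaseHTTPRequestHandler, HTTPServer

class StatementsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Every GET request gets the same text file back.
        with open("statements.txt", "rb") as f:
            body = f.read()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To run: HTTPServer(("", 8000), StatementsHandler).serve_forever()
```

One person, one afternoon, zero venture capital: that is the level of infrastructure the narrow freedom-of-speech question actually requires.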

Anyway, I’ve been typing enough out here that I have about as much saved in abortive paragraphs in another file, so I’ll stop for now. Suffice to say that this is really important stuff, and I think more tech people should be talking about it publicly because we’re in the position of understanding the power the industry does and doesn’t have.


[1]: I have literally zero internal knowledge about my employer’s relevant involvement or decisions. The internal knowledge about other stuff that I do have from working there is not at all referenced in any of this, so I am merely Jane Q. Public, cloud-knowledgeable techperson.

  • zkikiz · 4 years ago

    I think that, interestingly, it seems that decent human moderation doesn’t scale. The big companies seem to just be auto-banning any mention of bad words while allowing actual fascist behavior to continue unopposed (because fascists with more than two brain cells don’t use the words that describe them. Abusers don’t say “I’m going to abuse you!”)

    The three most workable moderation strategies appear to be Reddit’s, Mastodon’s, and email/spam/abuse scoring. FB/Twitter’s model seems very flawed even if technically compliant.

    Of course more laws on the topic will just lead to regulatory capture and monopolies. But I think we as internet denizens can see the right way forward.

    • roastpotatothief · 4 years ago

      I guess both exist: carriers and publishers. Maybe there needs to be a legal distinction, so a service can declare itself one or the other and then follow the appropriate rules.

      What would really happen if people on a “carrier” started anonymously posting bad stuff, like blackmail, doxxing, or threats? What model would prevent that?

      • Maybe a quorum of users could ban the offending user. To stop them from immediately rejoining, you could have a waiting list to join the service.
      • Maybe you block content from strangers by default, and just follow the people you know.
      • Maybe there’s no problem to solve. These bad things are already crimes and IMO they are very rare. Communities and police already have ways of dealing with them.

      Imagine this scenario. “A president wants to suppress some scandal (he is abusing his office to steal from orphaned children), so he orders the arrest of the journalist who is trying to expose him. He uses a carrier communication channel to communicate and enforce this arrest warrant.” This is an extreme example - any robust solution should be able to deal with that scenario.

      BTW the banning of bad words is, IMO, just the only censorship that can (today) be easily and fully automated. The services that today have forbidden words will have forbidden ideas as soon as the technology exists to automate that.

      • roastpotatothief · 4 years ago


        …But TBH this post touches on a few topics, all of them difficult, and none of which can ever have clean answers. It’s confusing because it’s a jumble of ideas.

        But they are all important and interesting. And what impressed me about this post was that there is so little groupthink here on Lemmy. People with opposite opinions happily debate each other.

        So I suggest you make an individual post about each topic. If you can figure out each answer, maybe you can tie them all together.

        • ability to communicate freely and effectively - a journalist should not be denied his platform - a local pub should have the same ability to advertise an upcoming gig, as a club owned by the mayor has
        • ability to prevent/punish harmful speech/actions - like publishing the addresses of civil rights activists along with bomb making instructions.
        • preventing accidental harm - like children seeing sword-swallowing tutorials
        • the value of the commons - having conversations with strangers and exploring ideas, somewhere your employer can’t eavesdrop.
        • the right to do wrong - to be gay in the 1950s or a Nazi today - it’s impossible to know what is truly bad, and what people in the future will realise is really fine. Therefore we must tolerate things we consider bad.
        • the internet (and effective communication) as a public service - a right. We can punish bad people for crimes, but we still don’t deny them human rights.