For instance, say I search for “The Dark Knight” on my Usenet indexer. It returns a list of uploads and where to get them via my Usenet provider. I can then download them, stitch them together, and verify that it is, indeed, The Dark Knight. All of this costs me only a few dollars a month.

My question is, why can’t copyright holders do this as well? They could follow the same process and then send takedown requests for each individual article that makes up the movie. We already know they try to catch people torrenting, so why don’t they do the same here?

I can think of a few reasons, but they all seem pretty shaky.

  1. The content is hosted in countries where the hosts don’t have to comply with takedown requests.

It seems unlikely to me that literally all of it is hosted in places like this. Plus, the providers wouldn’t be able to operate at all in countries like the US without facing legal repercussions.

  2. The copyright holders feel the upfront cost of indexer and provider access is greater than the cost of people pirating their content.

This also seems fishy. It’s cheap enough for me to do this as an individual, and if Usenet weren’t an option, I’d have to pay for 3+ streaming services to watch everything I currently do. They’d break even on this scheme even if the only person it cut off were me.

  3. They do actually do this, but it’s on a scale small enough for me not to care.

The whole point of doing this would be to make Usenet a non-viable option for piracy. If I don’t care about it because it happens so rarely, then what’s the point of doing it at all?

  • Darkassassin07@lemmy.ca · 10 months ago

    They do receive takedown notices; however, files uploaded to Usenet are mirrored across many providers in many jurisdictions, while also being split into many parts, as you noted. Usenet’s implementation of file sharing is quite robust: a file can be rebuilt even when it’s missing a significant portion of its data. To successfully take down a file, you need to remove many of these parts across almost all of the Usenet backbones, which requires cooperation across many nations/jurisdictions governed by varying laws. It’s not an easy task.
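
    A toy sketch of the parity idea behind that rebuilding (real Usenet posts ship PAR2 files, which use Reed-Solomon codes over many blocks; the plain XOR used here for brevity can only recover one missing block):

    ```python
    # Toy parity repair: split data into equal-sized blocks, keep one
    # XOR parity block, then rebuild a block lost to a takedown.
    def xor_blocks(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    blocks = [b"part-one", b"part-two", b"part-3!!"]  # equal-sized blocks
    parity = blocks[0]
    for blk in blocks[1:]:
        parity = xor_blocks(parity, blk)              # parity = b0 ^ b1 ^ b2

    missing = 1                                       # block 1 gets removed
    rebuilt = parity
    for i, blk in enumerate(blocks):
        if i != missing:
            rebuilt = xor_blocks(rebuilt, blk)        # XOR out the survivors

    assert rebuilt == blocks[missing]                 # block 1 recovered
    ```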

    Here’s a somewhat limited map of usenet providers:

  • _number8_@lemmy.world · 10 months ago

    why is the DMCA the one fucking law that actually gets enforced at a high rate when there are literally billions of more important things we could spend money on

    • Diplomjodler@feddit.de · 10 months ago

      Because the corporations that benefit from this law can afford to buy lots of politicians.

    • pelletbucket@lemm.ee · 5 months ago

      it’s not generally law enforcement enforcing it; it’s the copyright holders threatening civil action against things like internet service providers, who in turn will cut off your internet or some such. they have a lot of money, so they get law enforcement to do their bidding when they want, but the majority of DMCA action is civil action. this is my very uneducated opinion, looking from the outside

  • DosDude👾@retrolemmy.com · 10 months ago

    As far as I know, they do get DMCA’d. But the provider deletes a single file from the post, so it’s incomplete. If you have 2 different newsgroup providers, though, they usually didn’t delete the same file, so you can still download it.

    But I could be totally wrong because I haven’t really looked into this, and this is all from a very old memory.
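
    A minimal sketch of that failover idea, with a hypothetical provider.fetch() standing in for a real NNTP article request:

    ```python
    # Hypothetical per-article failover: a part removed from one
    # backbone often survives on another, so the download completes.
    class ArticleMissing(Exception):
        pass

    def download_article(message_id: str, providers) -> bytes:
        for provider in providers:
            try:
                # fetch() is a stand-in for an NNTP ARTICLE request;
                # assume it raises ArticleMissing if the part is gone.
                return provider.fetch(message_id)
            except ArticleMissing:
                continue  # try the next provider
        raise ArticleMissing(message_id)
    ```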

    • ShitpostCentral@lemmy.world (OP) · 10 months ago

      That makes some amount of sense. I’m not sure exactly how each article is stitched together to create the full file. Do you happen to know if it’s just put together sequentially, or if there’s XORing or a more complex algorithm going on there? If it’s only the former, they’d still be hosting copyrighted content, just a bit less of it.

      EDIT:
      https://sabnzbd.org/wiki/extra/nzb-spec
      This implies that they are just individually decoded and stitched together.
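
      Going by that spec, a rough sketch of the reassembly: each file in the NZB lists its segments with a number attribute, and a client fetches each message-id, decodes it (typically yEnc), and concatenates in order. fetch_and_decode below is a stand-in for the NNTP fetch + decode step a real client performs:

      ```python
      # Sketch: rebuild the first file described by an NZB by fetching
      # and decoding its segments in order (per the NZB spec above).
      import xml.etree.ElementTree as ET

      NS = {"nzb": "http://www.newzbin.com/DTD/2003/nzb"}

      def reassemble(nzb_path: str, fetch_and_decode) -> bytes:
          root = ET.parse(nzb_path).getroot()
          file_elem = root.find("nzb:file", NS)      # first file in the post
          segments = file_elem.findall("nzb:segments/nzb:segment", NS)
          segments.sort(key=lambda s: int(s.get("number")))
          data = b""
          for seg in segments:
              data += fetch_and_decode(seg.text)     # message-id -> decoded bytes
          return data
      ```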

      • grue@lemmy.world · 10 months ago

        Do you happen to know if it’s just put together sequentially, or if there’s XORing or a more complex algorithm going on there? If it’s only the former, they’d still be hosting copyrighted content, just a bit less of it.

        Copyright is a legal construct, not a technological one. Shuffling the file contents around doesn’t make the slightest bit of legal difference, as long as the intent is to reconstruct it back into the copyrighted work.

        (Conversely, if the intent was to, say, print out the file in hexadecimal and wallpaper your house with it, that wouldn’t be copyright infringement even if you didn’t rearrange it at all because the use was transformative. Unless the file in question was a JPEG of hex-digit wallpaper, of course.)

  • Nollij@sopuli.xyz · 10 months ago

    First, a massive amount of content is removed. You won’t find a lot of popular, unencrypted content on Usenet these days. It’s all encrypted and obfuscated now to avoid the bots.

    Speaking of bots, I don’t think you realize how much of this process is automated, or how wide of a net is being used. The media corporations all have enormous collected libraries of material. It gets posted constantly to all sorts of places. This includes public torrents, public Usenet, YouTube, PornHub (yes, really, even for non-porn), Facebook, TikTok, Tumblr, Gnutella, DDL sites…

    The list goes on and on. Each one gets scanned for millions of potentially infringing items, often daily. No actual people are doing those steps.

    Now, throw in things like private torrents, encrypted Usenet posts, invite-only DDL, and listings that use ‘3’ instead of ‘e’ or other character substitutions… These require actual humans to process. Humans who cost money, and a considerable amount of it. As a business, you have to show a return on investment. Fighting piracy, even at its theoretical best, doesn’t increase revenues by a lot.
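
    For illustration, a normalizer of the kind such a scanner might run before matching titles; the substitution table here is hypothetical, not any vendor’s actual ruleset:

    ```python
    # Hypothetical leetspeak normalizer for automated title matching.
    LEET = str.maketrans({"3": "e", "0": "o", "1": "i", "4": "a", "5": "s", "$": "s"})

    def normalize(title: str) -> str:
        return title.lower().translate(LEET)

    assert normalize("Th3 D4rk Kn1ght") == "the dark knight"
    ```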

    You mention revenue and breaking even, but you left out an important detail: your time is free. They don’t have to pay $10/month; they have to pay $10/month plus $20/hour for someone to deal with it. And most pirates of that level will just find another method.

    • Darkassassin07@lemmy.ca · 10 months ago

      You won’t find a lot of popular, unencrypted content on Usenet these days. It’s all encrypted and obfuscated now to avoid the bots.

      That’s not been my experience at all. Pretty much everything I’ve looked for has been available, and I rarely come across encrypted files. I do regularly have to try 2 or 3 NZBs before I find a complete one, but I almost always find one.

      • Nollij@sopuli.xyz · 10 months ago

        Are they obfuscated in any way? Depending on your client, you may not be able to see the names and subjects. But if you didn’t have the NZB, is there any real chance you could find it otherwise?

        • Darkassassin07@lemmy.ca · 10 months ago

          But if you didn’t have the NZB, is there any real chance you could find it otherwise?

          No, but that’s just the nature of NZB file sharing. The individual articles aren’t typically tagged/named with the actual file names; that info is pulled from the NZB and from the decompressed, stitched-together articles.

          I’m not using any special indexers, just ones with open public registration. The NZBs aren’t hard to find, for me or for IP claimants.

    • Evotech@lemmy.world · 10 months ago

      I remember specifically Game of Thrones. If you didn’t download the episode within a day or so of its release, it was DMCA’d and gone from most popular Usenet providers.

      Not seen that with anything else tho

      • jeeperv6@lemmy.dbzer0.com · 10 months ago

        HBO used to be bad. Going back to the Deadwood or True Blood days… the show would air, and within 20 minutes of the end credits it’d be up on Usenet. You had to start grabbing it ASAP; HBO’s sniffers would be on the lookout within an hour, and they’d take down just enough parts to make the PARs useless (parity files can only repair a limited number of missing blocks). I don’t miss the UUEncode days, refreshing a newsgroup every few minutes to see if my show got posted (or re-posted).

  • Z4rK@lemmy.world · 10 months ago

    You forgot the first rule of Usenet: We do not talk about Usenet. That’s why it works. Keep talking and see what happens.

  • thantik@lemmy.world · 10 months ago

    I mean, how does usenet compare to just pulling torrents from public trackers?

    Is there a good way of searching to find out if something is available before diving in?

    I don’t really know how Usenet compares, and private trackers seem like quite a pain; they’ve never really had anything I couldn’t find on public trackers anyways.

    • cm0002@lemmy.world · 10 months ago

      If what you want is recent (within the last 10-20 years) or older but popular, Usenet is far superior: if it’s available, it will always download as fast as your connection will allow. Torrents also need seeders in addition to the content simply being available, and then ofc you’re beholden to those seeders’ upload bandwidth.

      If what you want is obscure and/or old, torrenting is probably your best bet, especially since you can just leave the download “open” and download it byte by byte over months if you so choose. But even then, it’s still worth checking Usenet, since you never really know what people will upload and when. If someone reuploads something obscure/ancient and is never seen again, it doesn’t matter; you’ll still be able to download it fast until it’s removed or its retention expires (about 10 years for the good providers).

    • Darkassassin07@lemmy.ca · 10 months ago

      2 main advantages:

      • No hosting liability. Unlike with torrents, you’re not seeding, i.e. hosting, the files yourself; you’re purely downloading. This moves you out of the crosshairs of copyright holders, as they’re only interested in the hosts (providers). It also means a VPN is not necessary for Usenet downloading. (Providers don’t log who downloads what, either.)

      • Speed. As long as the content is available (hasn’t been removed due to DMCA/NTD), you’re always downloading at the maximum connection speed between you and your provider. No waiting/hoping for seeds and whatever their connections can provide. I’m usually at around 70 MB/s, whereas torrents very rarely broke 10 MB/s for me, usually struggling to reach 1 MB/s.

      As far as availability goes, stats from my Usenet client: of 17m articles requested this month, 78% were available. I’m only using a single Usenet provider. That availability percentage can be improved by using more than one provider in different jurisdictions (content is difficult to remove from multiple servers across different regions).
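
      A quick back-of-the-envelope check on that, assuming (optimistically) that providers miss articles independently; real takedowns are correlated across jurisdictions, so the actual gain would be smaller:

      ```python
      # If one provider is missing 22% of articles, and misses were
      # independent across providers, availability would grow like this:
      p_missing = 0.22
      for n in range(1, 4):
          print(f"{n} provider(s): {1 - p_missing ** n:.1%}")
      # 1 provider(s): 78.0%
      # 2 provider(s): 95.2%
      # 3 provider(s): 98.9%
      ```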