• Masimatutu@lemm.eeOP

      I’d say roughly 1,000 to 100,000, depending on format.

      Edit: Raw ASCII (7-bit) could give you up to ~half a million.

      Edit 2: According to Randall Munroe (too lazy to find the source), you could theoretically store one letter per bit. That would give us up to two million books.
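
      A quick sanity check on those figures (a sketch; the book length of ~150,000 characters is an assumption, not from the thread):

```python
# Books that fit in 64 GB at various encodings (assumed ~150k chars/book).
CHARS_PER_BOOK = 150_000
DISK_BITS = 64e9 * 8

for label, bits_per_char in [("UTF-8/ASCII, 8 bits/char", 8),
                             ("packed 7-bit ASCII", 7),
                             ("~1 bit per letter", 1)]:
    books = DISK_BITS / (bits_per_char * CHARS_PER_BOOK)
    print(f"{label}: ~{books:,.0f} books")
# -> ~426,667 / ~487,619 / ~3,413,333
```

      Longer books (say 350,000 characters) cut those counts by more than half, which puts the one-bit-per-letter case in the one-to-three-million range, consistent with the estimate above.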

      • takeda@szmer.info

        Edit 2: According to Randall Munroe (too lazy to find the source), you could theoretically store one letter per bit. That would give us up to two million books.

        I don’t see how that is possible; I think it should be one letter per byte.

        A bit only represents one of two states, 1 or 0 (true or false). That is too little information to store a letter.
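
        For the bits-versus-bytes point, the difference made concrete (plain Python, purely illustrative):

```python
# A bit holds one of two states; an ASCII/UTF-8 letter takes eight of them.
code = ord("A")                        # 65
print(f"'A' -> {code} -> {code:08b}")  # 'A' -> 65 -> 01000001
# A single bit on its own can only distinguish two symbols, e.g. 0='a', 1='b'.
```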

        • Masimatutu@lemm.eeOP

          Here ya go:

          Based on the rates of correct guesses—and rigorous mathematical analysis—Shannon determined that the information content of typical written English was around 1.0 to 1.2 bits per letter.

          https://what-if.xkcd.com/34/
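
          The ~1 bit per letter is an entropy figure: it measures how predictable the next letter is given everything before it, not how many bits a naive per-character encoding needs. A minimal comparison (my own sketch, not from the article):

```python
import math
from collections import Counter

text = "the quick brown fox jumps over the lazy dog " * 200

# Order-0 entropy: treat every character as independent of its neighbours.
counts = Counter(text)
n = len(text)
h0 = -sum((c / n) * math.log2(c / n) for c in counts.values())
print(f"order-0 entropy: {h0:.2f} bits/char")  # roughly 4 bits/char

# Shannon's 1.0-1.2 bits/letter only appears once you model context
# (guessing the next letter from the preceding text), which is what
# strong compressors approximate in practice.
```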

        • Doctor xNo@r.nf

          That’s bits. A letter or character is a byte (8 bits), which is about right for pure text files that have no overhead; any extra info (like font, size, type, anything except which character…) is extra bytes, of course.

          • jaybone@lemmy.world

            If we’re only talking 26 letters, no caps, we can cut that down to 5 bits. Then use a decent compression algorithm. Someone more bored than I am can do the math.
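
            The math, under stated assumptions (the ~350,000 characters per book is my own guess; no compression step included):

```python
# 26 lowercase letters -> 5 bits per symbol, packed with no padding.
BITS_PER_CHAR = 5
CHARS_PER_BOOK = 350_000   # ~70k words x 5 letters, assumed
DISK_BITS = 64e9 * 8

print(f"~{DISK_BITS / (BITS_PER_CHAR * CHARS_PER_BOOK):,.0f} books")  # ~292,571

# Packing demo: eight 5-bit codes fit exactly in 5 bytes (40 bits).
packed = 0
for ch in "thequick":
    packed = (packed << 5) | (ord(ch) - ord("a"))
print(packed.to_bytes(5, "big").hex())  # 99c90a204a
```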

            • Masimatutu@lemm.eeOP

              Five bits would only leave us with six punctuation marks (including the space, and we don’t get any numerals either), though. Do you think that’s enough? I certainly don’t; I have not even used a full stop and I have already exceeded it!
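
              Concretely, the 32-code budget (one possible allocation, purely illustrative):

```python
letters = "abcdefghijklmnopqrstuvwxyz"    # 26 codes
extras  = [" ", ".", ",", "'", "?", "!"]  # the six codes left over
assert len(letters) + len(extras) == 2 ** 5  # exactly 32 codes in 5 bits
# No capitals, digits, quotes, colons or dashes, as the comment notes.
```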

      • Sotuanduso@lemm.ee

        One letter per bit? You’d need some crazy effective compression algorithm for that, because a bit is 1 or 0. Did you mean byte?

        • AdrianTheFrog@lemmy.world

          UTF-8 and ASCII are normally already 1 character per byte. With great file compression, you could probably reach 2 characters per byte, or one every 4 bits. One character per bit is probably impossible. Maybe with some sort of AI file compression, using an AI’s knowledge of the English language to predict the message.

          Edit: Wow, apparently that already exists, and it can achieve an even higher compression ratio, almost 10:1! (with 1 GB of UTF-8 (8-bit) text from Wikipedia) bellard.org/nncp/

          If an average book has 70k 5-character words, this could compress it to around 303 kilobits (~38 KB), meaning you could fit 1.6 million books in 64 GB.

          You can get a 2 TB SSD for around $70. With this compression scheme you could fit 52 million books on it.

          I’m not sure if I’ve interpreted the speed data right, but it looks like it would take around a minute to decode each book on a 3090. It would take about a year to encode all of the books on the 2 TB SSD if you used 50 A100s (~$9,000 each). You could also use 100 3090s to achieve around the same speed (~$1,000 each).

          52 million books is around the number of books written in the past 20 years, worldwide. All stored for $70 (+$100k of graphics cards).
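
          The same numbers in one place (a sketch; the ~10:1 ratio is from the linked nncp results, the book size is assumed):

```python
CHARS_PER_BOOK = 350_000   # 70k words x 5 characters, assumed
RATIO = 10                 # ~10:1 on UTF-8 text, per bellard.org/nncp/

book_bytes = CHARS_PER_BOOK / RATIO    # ~35 KB per book
print(f"per book: ~{book_bytes / 1e3:.0f} KB")

for disk_gb in (64, 2000):
    print(f"{disk_gb} GB holds ~{disk_gb * 1e9 / book_bytes / 1e6:.1f} million books")
# -> 64 GB: ~1.8 million; 2 TB (2000 GB): ~57.1 million
```

          That is close to the 1.6 and 52 million above; the small gap is just rounding of the per-book size.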

          • Sotuanduso@lemm.ee

            There’s something comical about the low low price of $70 (+$100k of graphics cards) still leaving out the year it will take.

            • Cicraft@lemmy.world

              Well, I guess you could sacrifice a portion of the space for an index system and just decode the one you’re trying to read.
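
              That could be as simple as an offset table stored next to the compressed blobs, so a reader seeks to one book and decodes only that one. A hypothetical layout (zlib stands in for the real, much stronger codec):

```python
import zlib

books = {"book_a": b"text of book a...", "book_b": b"text of book b..."}

# Build: compress each book separately and record (offset, length).
blobs, index, offset = [], {}, 0
for title, text in books.items():
    blob = zlib.compress(text)
    index[title] = (offset, len(blob))
    blobs.append(blob)
    offset += len(blob)
payload = b"".join(blobs)

# Read: fetch one book without decompressing the others.
off, length = index["book_b"]
print(zlib.decompress(payload[off:off + length]))
```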

    • DancingIsForbidden@lemmy.world

      Have a couple of old pirated e-textbooks as .pdf files on my PC from uni, several hundred pages with color images, and they are mostly under 50 MB, averaging about 30 MB. 1 GB is a little over a thousand MB (1024), so one GB would hold around 34 of them; times 64, that’s a hell of a lot. A couple of thousand total, at least, as sizes vary.
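
      Spelled out (using the ~30 MB average above):

```python
AVG_PDF_MB = 30
per_gb = 1024 / AVG_PDF_MB   # ~34 books per GB
print(f"~{per_gb:.0f} per GB, ~{per_gb * 64:,.0f} in 64 GB")  # ~34 / ~2,185
```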

    • H3‎@lemmy.blahaj.zone

      A shitload. 64,000 if it were simple text-only stuff at 1 MB per book; 640 if it were 100 MB chonkers full of images.

      • Mog_fanatic@lemmy.world

        Yeah, I read mostly sci-fi books, around 300-400 pages, all text, and I’d say the average e-book for them is like 150-200 KB. If it were books like that, you’d be looking at stuffing like 300,000 books on there.