• Mikina@programming.dev · +61 · 2 days ago

    I mean, that’s literally how research works. You make small discoveries and use them to move forward.

    • over_clox@lemmy.world · +3 / −101 · 2 days ago

      What’s to research? A fucking abacus can hold data longer than a goddamn hour.

      • Deceptichum@quokk.au · +52 / −1 · 2 days ago

        Are you really comparing a fucking abacus to quantum mechanics and computing?

      • Zement@feddit.nl · +34 / −1 · 2 days ago

        Are you aware that the RAM in your computing devices loses information when you read a bit?

        Why don’t you switch from smartphone to abacus and dwell in the anti science reality of medieval times?

        • FiskFisk33@startrek.website · +8 · 2 days ago

          And that it loses data after merely a few milliseconds if left alone; to account for that, DDR5 reads and rewrites even unused data every 32 ms.
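The refresh behaviour described above can be sketched with a toy model: a DRAM cell's capacitor charge decays over time, the bit is lost once the charge falls below a sense threshold, and a periodic refresh rewrites the cell before that happens. The 32 ms interval is the figure from the comment; the decay constant and threshold here are made-up illustrative values, not real DRAM parameters.

```python
import math

DECAY_TAU_MS = 50.0   # assumed charge-decay time constant (illustrative only)
THRESHOLD = 0.5       # fraction of full charge needed to still sense a "1"

def charge_after(ms_since_refresh: float) -> float:
    """Remaining charge fraction after ms_since_refresh milliseconds."""
    return math.exp(-ms_since_refresh / DECAY_TAU_MS)

def readable(total_ms: float, refresh_ms=None) -> bool:
    """Is the bit still readable after total_ms, given a refresh interval?"""
    # With refresh, only the time since the last refresh matters.
    since_refresh = total_ms if refresh_ms is None else total_ms % refresh_ms
    return charge_after(since_refresh) >= THRESHOLD

print(readable(1000.0, None))   # no refresh: the bit has leaked away -> False
print(readable(1000.0, 32.0))   # refreshed every 32 ms: bit survives -> True
```

In this model the stored bit is never "stable" on its own; it only looks stable because the refresh loop keeps outrunning the decay.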

        • over_clox@lemmy.world · +1 / −17 · 2 days ago

          You’re describing how ancient magnetic core memory works; that’s not how modern DRAM (Dynamic RAM) works. DRAM uses a constantly pulsing refresh cycle to recharge the tiny capacitor in each cell.

          And on top of that, SRAM (Static RAM) doesn’t even need the refresh circuitry; it just works and holds its data as long as it remains powered. It only takes 2 discrete transistors, 2 resistors, 2 buttons and 2 LEDs to demonstrate this on a simple breadboard.

          I’m taking a wild guess that you’ve never built any circuits yourself.
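The breadboard latch described in the comment can be modelled at the logic level: two cross-coupled inverting stages hold a bit indefinitely while powered, and reading is non-destructive, so no refresh is needed. This is only a sketch of the feedback behaviour; the class and method names are invented for illustration, and the electrical details (resistors, buttons, LEDs) are not modelled.

```python
class CrossCoupledLatch:
    """Toy logic-level model of a two-transistor set/reset latch."""

    def __init__(self) -> None:
        self.q = False        # one side of the latch
        self.q_bar = True     # the other side, always the inverse

    def _settle(self) -> None:
        # Each inverter drives the other's input; the loop holds its state.
        self.q_bar = not self.q

    def press_set(self) -> None:
        self.q = True
        self._settle()

    def press_reset(self) -> None:
        self.q = False
        self._settle()

    def read(self) -> bool:
        return self.q         # non-destructive: state is untouched

latch = CrossCoupledLatch()
latch.press_set()
for _ in range(100_000):      # read as many times as you like
    assert latch.read() is True
print(latch.read())           # still True: no refresh cycle required
```

The contrast with the DRAM case is the point: here the feedback loop actively maintains the bit, so nothing decays between reads.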

          • Zement@feddit.nl · +11 · edited · 2 days ago

            I’m taking a wild guess that you completely ignored the subject of the thread to start an electronics engineering pissing contest?

            • over_clox@lemmy.world · +1 / −13 · 2 days ago

              Do you really trust the results of any computing system, no matter how it’s designed, when it has pathetic memory integrity compared to ancient technology?

          • AbidanYre@lemmy.world · +5 · 2 days ago

            And you would have been there shitting on magnetic core memory when it came out. But without that we wouldn’t have the more advanced successors we have now.

              • AbidanYre@lemmy.world · +5 · 2 days ago

                Doubt.

                Core memory loses information on read and DRAM is only good while power is applied. Your street dime will be readable practically forever and your abacus is stable until someone kicks it over.

                You’re not the arbiter of what technology is “good enough” to warrant spending money on.

                • over_clox@lemmy.world · +1 / −4 · 1 day ago

                  Core memory is also designed to accommodate for that and almost instantly rewrites the data back to memory. That in itself might be a crude form of ‘error’ correction, but it still lasts way longer than an hour.

                  Granted, quantum computers are a different beast of their own, but how much digital data does a qubit actually store? And how does that stack up in a price-per-bit comparison?

                  If they already know quantum computers are more prone to memory errors, why not use reliable conventional RAM to store the intermediate data and let the quantum side of things be the ‘CPU’, or QPU if you like?

                  I dunno, it just makes absolutely no sense to me to utilize any sort of memory technology that, even with error correction, still manages to lose information faster than a jumping spider’s memory.

      • frezik@midwest.social · +23 · 2 days ago

        This must be the dumbest take on QC I’ve seen yet. You’d expect a lot of people to focus on how it’ll break crypto; there’s a great deal of nuance around that, and people should probably shut up about it. But “a dime stuck in the road is a stable datapoint” sounds like a late-19th-century op-ed about how airplanes are impossible.