- cross-posted to:
- arstechnica_index@rss.ponder.cat
As a reminder, current estimates are that quantum cracking of a single 2048-bit RSA key would require a computer with 20 million qubits running in superposition for about eight hours. For context, quantum computers maxed out at 433 qubits in 2022 and 1,000 qubits last year. (A qubit is a basic unit of quantum computing, analogous to the binary bit in classical computing. Comparisons between qubits in true quantum systems and quantum annealers aren’t uniform.) So even when quantum computing matures sufficiently to break vulnerable algorithms, it could take decades or longer before the majority of keys are cracked.
The upshot of this latest episode is that while quantum computing will almost undoubtedly topple many of the most widely used forms of encryption used today, that calamitous event won’t happen anytime soon. It’s important that industries and researchers move swiftly to devise quantum-resistant algorithms and implement them widely. At the same time, people should take steps not to get steamrolled by the PQC hype train.
Everyone thinks about the real-time implications, but what about historical data? Seems pretty likely that the NSA has been storing an appreciable fraction of the internet for a long damn while. Come Q-Day, all of that gets decrypted and becomes searchable. What would Trump do?
Man, quantum computers have been about-to-break-encryption since the 90s. The hype never ends; a new crop of people first hear it, then figure out it’s bullshit.
Not to mention we already have quantum-computer-resistant cryptography.
But isn’t the point that we just need to stay ahead of it? Surely encryption used in the 90s could be broken by a quantum computer today?
I do not know of any such occurrence. I would like to know about it.
It seems RSA-155 (512-bit) encryption, commonly used in the 90s, was factored in 1999 with purely classical methods, no quantum needed (RSA’s security rests entirely on the difficulty of factoring its prime-based modulus).
Though from what I can find, reddit users from 10 years ago were confident a modern 128-bit algorithm (e.g. AES) could never be brute-forced, even by quantum computers.
I dunno, sometimes I wonder if not everyone on the internet is an expert.
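For what it’s worth, the standard argument behind that old reddit confidence is Grover’s algorithm: quantum search gives only a quadratic speedup over brute force, so an n-bit key costs roughly 2^(n/2) quantum queries instead of 2^n classical guesses. A rough arithmetic sketch (the function name and numbers are illustrative, not from the article):

```python
def grover_queries(key_bits: int) -> int:
    # Grover's algorithm gives a quadratic speedup over brute force:
    # roughly 2^(n/2) quantum queries instead of 2^n classical guesses.
    return 2 ** (key_bits // 2)

classical_guesses = 2 ** 128          # exhaustive search of AES-128
quantum_queries = grover_queries(128) # 2^64: still astronomically large

# Doubling the key to AES-256 restores a 2^128 quantum work factor,
# which is the usual recommendation for symmetric crypto post-quantum.
```

So "never brute-forced" is overstated as phrased, but 2^64 serial quantum queries is still far beyond anything plausible, which is roughly what those commenters meant.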
It’s like nuclear fusion, always just around the corner…
If qubit counts double every year, we’re at 20 million in about 15 years. Changing crypto takes a very long time on some systems. If we’re at ~20,000 in 5 years, we’d better have usable post-quantum crypto in place to start mitigations.
But I’m not convinced yet that we’ll actually hit those numbers. Especially error-free qubits…
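The arithmetic behind that projection is simple exponential growth; here is a sketch (the ~1,000-qubit starting point is the article’s figure, the doubling-every-year rate is the commenter’s assumption, and the function name is made up):

```python
import math

def years_to_reach(target: int, current: int,
                   doublings_per_year: float = 1.0) -> float:
    # Solve current * 2^(r * t) = target for t,
    # where r is the number of doublings per year.
    return math.log2(target / current) / doublings_per_year

# Assumed: ~1,000 qubits today, doubling annually.
years_to_reach(20_000_000, 1_000)  # ~14.3 years
```

The whole estimate hinges on the doubling assumption holding for 15 straight years, which is exactly what the commenter doubts.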
If qubits double every year
And then we need to increase coherence time, which is ~50 ms for the current 433-qubit chip. Error correction might work, but it might not.
Error correction does fix that problem but at the cost of increasing the number of qubits needed by a factor of 10x to 100x or so.
But who guarantees that error correction will overcome the decoherence introduced by that many qubits? Not a trivial question, and nobody can answer it for certain.
I don’t know where the 20 million comes from. Estimates are ~4,000 logical qubits for RSA-2048.
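The two figures can be reconciled: the ~4,000 estimate counts logical (error-corrected) qubits, while the article’s 20 million counts noisy physical qubits. A rough sketch (the ~5,000x overhead factor is back-derived from the two numbers for illustration, not a sourced figure):

```python
def physical_qubits(logical: int, overhead: int) -> int:
    # Fault-tolerant schemes encode each logical qubit
    # in a large number of noisy physical qubits.
    return logical * overhead

# Illustrative only: ~4,000 logical qubits for RSA-2048,
# times an assumed ~5,000x error-correction overhead,
# lands at the 20 million physical qubits the article cites.
physical_qubits(4_000, 5_000)  # 20,000,000
```

The overhead factor in real proposals depends heavily on physical error rates, which is why estimates vary by orders of magnitude.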
1000 qubits? Where? Last time I checked it was 50 qubits for 200ms
Ok, I decided to dive into it again today, and look what I found:
- They still demonstrate “supremacy” to each other by proving that their setup couldn’t be classically simulated. These 433- and 1,000-qubit processors are good for only one purpose: simulating themselves.
- Photonic QCs still compute the hafnian billions of times faster; if only that mathematical structure had any practical meaning.
- They demonstrated that toric codes might be effective.