Quantum and Post Quantum Flashcards
Quantum mechanics and Young’s experiment
Quantum mechanics is an area of physics where classical deterministic rules do not hold anymore (as shown by Schrödinger, Heisenberg, …).
Some examples are:
* Uncertainty principle: an intrinsic limitation of microscopic mechanics that limits the measurement precision of certain pairs of physical properties (e.g. position and momentum). The more accurately one property of a particle is measured, the less accurately the other property can be known.
* Quantum superposition: the ability of a quantum system to be in multiple states simultaneously, until it is measured. To phrase it better: “In a quantum system all states are possible with different probabilities until you perform a measurement, which forces a state”.
The main quantum mechanics properties needed to present quantum computing are: quantum superposition, quantum entanglement and the no-cloning theorem.
Probabilistic interpretation of QM
Given that in quantum mechanics each state is possible with a specific probability until measured:
* Particles are described by a wave function dependent on the position and time.
* The wave function is related to the probability of observing the particle at a given position at a given time.
* The squared modulus of the wave function equals the probability density P(x, t) of finding the particle at position x at time t.
Young’s experiment
A light beam is shone against a barrier with 2 vertical slits: normally we would expect the light to pass either through the left slit or the right one, whereas this experiment revealed interference. Young’s double-slit experiment demonstrates light’s wave nature: coherent light passing through two slits creates an interference pattern of alternating bright and dark fringes. This interference arises from the superposition of wavefronts. In quantum physics, the same experiment with single photons, electrons, or other particles shows that even individual particles interfere with themselves, forming a pattern over time.
Quantum entanglement
Quantum entanglement is the phenomenon in which a group of particles (minimum two) is said to be entangled when the quantum state of each particle of the group cannot be described independently of the state of the others. In this sense:
* It is a global system state that persists even when the particles are separated by a large distance.
* Even if they are 2 different particles, they cannot be considered individually.
As an example of an entangled quantum system:
* A system of two entangled particles A and B has null total spin.
* The particles are separated (but they remain correlated, since they are entangled).
* A is measured and found to have clockwise spin. Keep in mind that we cannot force the value of the spin (e.g. to 0 or 1), but only measure it (a random outcome), which forces the spin into the measured state.
* Then, if we measure B, it will have counterclockwise spin, countering A’s measured clockwise spin so that the total spin remains null.
An entangled system seems to achieve communication faster than the speed of light: once A is measured, B, when measured, will immediately present a state that respects the general relationship between the 2 entangled particles.
The problem is that we cannot use an entangled quantum system for transmitting information, as we cannot force the spin of A, i.e. we cannot impose a 0 or a 1 on A’s spin, but only measure it randomly.
No-cloning theorem
The no-cloning theorem states that it is impossible to create an independent and identical copy of an arbitrary unknown quantum state.
Note the word “independent”: otherwise we could use entanglement to obtain a (correlated) copy.
Quantum computing (QC): is it different from the classic one?
Yes, namely for the following properties:
* Quantum superposition: Quantum Computing uses qubits, which can exist in a superposition of 0 and 1 simultaneously, hence it can explore multiple states at once, processing many cases in parallel. This is particularly useful to explore different states simultaneously during computation (see the sketch after this list).
* Quantum entanglement: permits Quantum Computing to coordinate and process complex calculations more efficiently.
* Quantum interference: Quantum Computing can exploit constructive and destructive interference to amplify the probabilities of correct answers while cancelling out incorrect ones; this boosts the efficiency of solving certain types of problems.
Therefore, Quantum Computing is NOT general-purpose computing. So far we have seen the advantages coming from quantum computing, but there are other elements to consider:
* Quantum Computing works for specific classes of problems, where the quantum effects can be exploited (e.g. superposition). As of now, it is not applicable to all problems.
* Currently, the most successful area is optimization, with Quantum Annealing (QA) running on a special-purpose QC, based on superconducting technology implementing adiabatic quantum computation.
* Adiabatic quantum computation is just one possibility; there are many approaches with different technologies, which lead to installation constraints and manufacturing problems.
The Quantum Problem for Cryptography
Current cryptography has an expiration date. The Global Risk Institute published in 2024 the “2024 Quantum Threat Timeline Report” about Cryptographically Relevant Quantum Computers, drawing the conclusion that within 15-30 years a sufficiently powerful QC will be built, able to decrypt ANY data encrypted on the internet today.
Therefore, we should start to be worried and act TODAY, as it takes time to develop, test, validate, implement and mass-deploy new cryptographic techniques, and attackers could implement the HNDL strategy, i.e. HARVEST (encrypted data) NOW, DECRYPT LATER, when QCs dedicated to decryption are available. Of course, this assumes that the collected data are still useful some years from now.
Security of current cryptography wrt quantum
The security of current cryptography is based on:
* Computationally hard mathematical problems, such as the discrete logarithm (over integers or elliptic curves) and the factorization of integers (stated concretely after this list). Typically used in asymmetric cryptography for both digital signatures and key exchange.
* Infeasible exhaustive search: the assumption is that it is computationally unfeasible to perform all the computations of an exhaustive search. Typically used in symmetric cryptography for both encryption and hashing.
Shor’s algorithm
Shor’s algorithm was developed before quantum computers existed: Shor imagined that we could build quantum computers and developed this algorithm to find the prime factors of an integer number with a QC.
The properties of Shor’s algorithm are:
* It accelerates finding the “order”. As a recap, the order-finding problem is: given two coprime numbers x and N, what is the smallest integer r ≥ 1 (the order) such that x^r mod N = 1? (See the sketch after this list.)
* Polynomial complexity, cubic in the logarithm of N: ~ O((log N)^3).
* The algorithm is not deterministic: each run has a success probability below 1 (typically 0.5) of finding the solution, so it should be repeated more than once to achieve success.
* If the quantum computer does not succumb to quantum noise or decoherence (e.g. temperature variations, electromagnetic field variations, vibrations…), it can break RSA, DH, ECDSA, ECDH.
For now only special cases have been solved, but up to now quantum computers have always performed better than expected (even better than Moore’s law), as there is always a push for new technology to solve certain kinds of problems. Therefore, it is very difficult to forecast when a quantum computer, in whatever technology becomes available, will be able to effectively run Shor’s algorithm on large numbers.
Shor’s algorithm complexity
Number of effective qubits needed to factor keys of n bits:
– 2000 qubits to factor 1024-bit keys.
– 4000 qubits to factor 2048-bit keys.
– 8000 qubits to factor 4096-bit keys.
This is the number of effective qubits (i.e. after error correction).
Typically, there is 1 effective qubit for every 10 physical qubits (due to QEC).
Shor’s algorithm requires a quantum circuit with a certain depth, which is the number of stages that process the information.
Circuit depth (in units of 10^6):
– roughly 900 × 10^6 as circuit depth for a 1024-bit key.
– roughly 1200 × 10^6 as circuit depth for a 2048-bit key.
– roughly 10000 × 10^6 as circuit depth for a 4096-bit key.
The connections between operators in a quantum computer are not flexible: there is a fixed combination, designed with fixed connections at the microscopic level by the manufacturer.
Note: this assumes an ideal circuit, but mapping to a real quantum computer would multiply this by at least one order of magnitude.
Quantum Error Correction (QEC)
Quantum Error Correction (QEC) explains why there’s a reduction factor of 10 when considering the effective number of qubits.
QEC is needed to eliminate the effect of noise on stored quantum information, faulty quantum gates, faulty quantum state preparation, and faulty measurements.
Classic error correction uses redundancy (i.e. copying the information), but this is impossible in the quantum world because of the no-cloning theorem.
Shor discovered a method for formulating a QEC code by storing the information of one qubit onto a highly entangled state of nine qubits. In other words, in order to have 1 qubit effectively working with error correction, we need 9 other highly entangled qubits; from this stems the factor-10 reduction.
The measurements on the QEC qubits provide information about the error that happened, but not about the value of the affected qubit (otherwise they would destroy the quantum superposition with the other qubits in the system). The possible errors are value (bit-flip) or sign (phase-flip) errors, which can be reverted by applying the appropriate Pauli matrix operator.
Grover’s algorithm
Grover’s algorithm was developed in 1996 by Lov Kumar Grover. Given (plaintext, ciphertext) pairs, it is able to find the key of a symmetric encryption algorithm, such as AES.
Grover’s algorithm is a quantum algorithm for unstructured search:
* Finds with high probability the unique input to a black-box function that produces a specific output value (a preimage).
* Uses O(√N) evaluations of the function, where N is the size of the function’s domain.
* Normal search is O(N) so there is a quadratic speed-up.
Therefore, Grover’s algorithm is used to accelerate brute-force attacks on symmetric ciphers:
* 2^64 quantum operations needed to break AES-128.
* 2^128 quantum operations needed to break AES-256.
Grover’s algorithm can also be used to find collisions in hash functions, but the rho algorithm performs better.
Pros
– Useful also for other applications (unstructured searches)
– A large specialized quantum computer that can recover AES-128 keys will likely be much smaller and easier to build than one which implements Shor’s algorithm for 256-bit ECs or 2048-bit RSA/DSA keys.
Cons
– Quantum operations overhead, which is much bigger than for traditional non-quantum operations. In this sense, the connections of all qubits and the circuit depth are tailored to this specific problem, which is good but also bad at the same time, as it costs billions: to run Grover’s algorithm, a specific QC with a precise connection layout and depth needs to be designed, and this QC can only be used to solve this problem.
– Lack of parallelization in search, as the algorithm is based on serial operations.
A possible countermeasure to still protect our crypto systems from a QC designed to use this algorithm is to double the key size (but no AES-512 exists) or double the hash length.
Brassard’s algorithm
Brassard’s algorithm solves the collision problem for a black-box r-to-one function F:
* r different inputs generate the same output value.
* Find a collision, i.e. find x0 and x1 such that F(x0) = F(x1).
It is straightforward to notice that this algorithm is suitable for attacking hash functions, with better performance than the birthday attack.
Brassard’s algorithm uses Grover’s algorithm in a different way, trading off space for speed. In this sense, the comparison between Grover and Brassard for collision search is as follows:
* Grover: time = O(N^(1/2)) and space = O(log N).
* Brassard: time = O((N/r)^(1/3)) and space = O(N^(1/3)), with r being the parameter of the r-to-one function F.
It is noticeable how Brassard is faster than Grover, while having a higher memory complexity.
Mosca’s rule
Given:
- Q: time to develop a quantum computer for Shor/Grover algorithms
- S: time the data needs to remain secure
- U: time needed to update crypto systems from classic to quantum-resistant (i.e. post-quantum) solutions.
If: (U + S > Q) Then: “Houston, we have a problem!”
Only after the update is complete will the newly created data be post-quantum resistant. The risk window is the interval of time during which data can be copied and harvested: no QC can yet break them, but the algorithms used are not yet quantum-resistant. Therefore, once Q has passed, the harvested data can be decrypted.
SIKE
SIKE, which stands for Supersingular Isogeny Key Encapsulation, is a KEM algorithm proposed by Microsoft to the NIST Post-Quantum competition.
- SIKE is based on the Supersingular Isogeny Diffie-Hellman (SIDH) key exchange protocol (i.e. a variant of DH on a specific class of curves).
- It uses arithmetic operations on elliptic curves over finite fields and computes maps (isogenies) between such curves.
- Its security is tied to the hardness of finding a specific isogeny between two elliptic curves, which is equivalent to finding a path in the isogeny graph. Notice how this is a different problem from the discrete logarithm on an EC (the basis of ECDH).
- SIKE was implemented by Microsoft and Cloudflare as a constant-time library, to avoid time-based side-channel attacks. But it was attacked in another ingenious way, namely Hertzbleed.
Besides the Hertzbleed attack, there are other attacks against the intrinsic math of SIDH, so the authors decided to withdraw the proposal from the NIST PQ competition. However, they did so with a final submission to keep a record of the problem, describing their failure in detail to prevent other people from repeating the same mistake.