Quantum and Post-Quantum Flashcards

1
Q

Quantum mechanics and Young’s experiment

A

Quantum mechanics is an area of physics where the classical deterministic rules no longer hold; it is instead governed by quantum rules (e.g. the Schrödinger equation, the Heisenberg uncertainty principle…).

Some examples are:
* Uncertainty principle: an intrinsic limitation of microscopic mechanics on the measurement precision of certain pairs of physical properties (e.g. position and momentum). The more accurately one property of a particle is measured, the less accurately the other property can be known.
* Quantum superposition: the ability of a quantum system to be in multiple states simultaneously, until it is measured. To phrase it better: “in a quantum system all states are possible with different probabilities until you perform a measurement, which forces a state”.

The main quantum-mechanical properties needed to present quantum computing are: quantum superposition, quantum entanglement, and the no-cloning theorem.

2
Q

Probabilistic interpretation of QM

A

Given that in quantum mechanics each state is possible with a specific probability until measured:
* Particles are described by a wave function dependent on position and time.
* The wave function is related to the probability of observing the particle at a given position at a given time.
* The squared modulus of the wave function equals the probability density P(x, t) of finding the particle at position x at time t.
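
In symbols (the standard Born rule, written out in LaTeX for reference; a normalized one-dimensional wave function is assumed):

% Born rule: the probability density is the squared modulus of the wave function
P(x, t) = |\psi(x, t)|^{2}, \qquad \int_{-\infty}^{+\infty} |\psi(x, t)|^{2}\, dx = 1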

3
Q

Young’s experiment

A

A light beam is shone against a barrier with two vertical slits. Classically we would expect the light to pass either through the left slit or through the right one, whereas the experiment shows interference. Young’s double-slit experiment demonstrates light’s wave nature: coherent light passing through two slits creates an interference pattern of alternating bright and dark fringes. This interference arises from the superposition of wavefronts. In quantum physics, the same experiment with single photons, electrons, or other particles shows that even individual particles interfere with themselves, forming a pattern over time.
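
A minimal Python sketch of the ideal two-slit fringe pattern (the parameters below are illustrative assumptions, not from the card; the single-slit envelope is ignored):

import numpy as np

# Ideal two-slit interference: I(x) proportional to cos^2(pi * d * x / (lambda * L))
wavelength = 633e-9   # assumed red laser, in metres
d = 1e-4              # assumed slit separation, in metres
L = 1.0               # assumed slit-to-screen distance, in metres

x = np.linspace(-0.02, 0.02, 9)                            # positions on the screen, in metres
intensity = np.cos(np.pi * d * x / (wavelength * L)) ** 2  # relative intensity (maximum = 1)
for xi, I in zip(x, intensity):
    print(f"x = {xi * 1000:+6.2f} mm   relative intensity = {I:.2f}")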

4
Q

Quantum entanglement

A

Quantum entanglement is the phenomenon in which a group of particles (at least two) is said to be entangled when the quantum state of each particle of the group cannot be described independently of the state of the others. In this sense:
* It is a global system state that persists even when the particles are separated by a large distance.
* Even though they are two different particles, they cannot be considered individually.

As an example of an entangled quantum system:
* A system of two entangled particles A and B has null total spin.
* The particles are separated (but they remain correlated, since they are entangled).
* A is measured and found to have clockwise spin. Keep in mind that we cannot force the value of the spin (e.g. a 0 or a 1); we can only measure it (a random outcome), which forces the spin into the measured state.
* Then, if we measure B, it will have counterclockwise spin, opposite to A’s measured clockwise spin, so that the total spin remains null.

An entangled system seems to achieve communication faster than the speed of light: once A is measured, B, when measured, will immediately present a state that respects the overall relationship between the two entangled particles.

The problem is that we cannot use an entangled quantum system to transmit information, because we cannot force the spin of A: we cannot impose a 0 or a 1 on A’s spin, only measure it at random.
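
A toy Python sketch of the spin example above; this is only a classical caricature of the anti-correlation (it does not simulate real quantum mechanics), but it shows why the outcome cannot be chosen and hence cannot carry a message:

import random

def measure_entangled_pair():
    # A's outcome is random and cannot be forced by the experimenter...
    spin_a = random.choice(["clockwise", "counterclockwise"])
    # ...while B's outcome is always the opposite, so the total spin stays null.
    spin_b = "counterclockwise" if spin_a == "clockwise" else "clockwise"
    return spin_a, spin_b

for _ in range(5):
    a, b = measure_entangled_pair()
    print(f"A: {a:16} B: {b}")   # always opposite, but the individual value is random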

5
Q

No-cloning theorem

A

The no-cloning theorem states that it is impossible to create an independent and identical copy of an arbitrary unknown quantum state.
Note “independent”, otherwise we could use entanglement.

6
Q

Quantum computing (QC): is it different from the classic one?

A

Yes, namely for the following properties:
* Quantum superposition: Quantum Computing uses qubits, which can exist in a superposition of 0 and 1 simultaneously, hence it can explore multiple states at once, processing many cases in parallel. This is particularly useful to explore different states simultaneously during computation.
* Quantum entanglement: allows Quantum Computing to coordinate qubits and process complex calculations more efficiently.
* Quantum interference: Quantum Computing can exploit constructive and destructive interference to amplify the probabilities of correct answers while cancelling out incorrect ones; this boosts the efficiency of solving certain types of problems.

Therefore, Quantum Computing is NOT general-purpose computing. As of now we’ve seen the advantages coming from quantum computing, but there are other elements to consider:
* Quantum Computing works for specific classes of problems, where the quantum effects can be exploited (e.g. superposition). As of now, it is not applicable to all problems.
* Currently, the most successful area is optimization, with Quantum Annealing (QA) running on a special-purpose QC based on superconducting technology implementing adiabatic quantum computation.
* Adiabatic quantum computation is just one possibility: there are many approaches with different technologies, which lead to installation constraints and manufacturing problems.

7
Q

The Quantum Problem for Cryptography

A

Current cryptography has an expiration date. In 2024 the Global Risk Institute published the “2024 Quantum Threat Timeline Report” about Cryptographically Relevant Quantum Computers, drawing the conclusion that within 15-30 years a sufficiently powerful QC will be built, able to decrypt ANY data encrypted on the internet today.
Therefore, we should start worrying and act TODAY, as it takes time to develop, test, validate, implement and mass-deploy new cryptographic techniques, and attackers can already implement the HNDL strategy, i.e. HARVEST (encrypted data) NOW, DECRYPT LATER, when QCs dedicated to decryption are available. Of course, this assumes the collected data are still useful some years from now.

8
Q

Security of current cryptography wrt quantum

A

The security of current cryptography is based on:
* Mathematically hard problems, such as the discrete logarithm (over integers or elliptic curves) and factorization (of integers). Typically used in asymmetric cryptography for both digital signatures and key exchange.
* Infeasible exhaustive search: the assumption that it is computationally infeasible to perform all the computations of an exhaustive search. Typically relied upon in symmetric cryptography for both encryption and hashing.

9
Q

Shor’s algorithm

A

Shor’s algorithm was developed before quantum computers existed. Shor imagined that quantum computers could one day be built and developed this algorithm to find the prime factors of an integer with a QC.

The properties of Shor’s algorithm are:
* Accelerates finding the “order”. As a recap, the order-finding problem is: given two numbers x and N, what is the smallest integer r > 1 (the order) such that x^r mod N = 1?
* Polynomial complexity that grows as the cube of the logarithm of N, ~O((log N)^3).
* The algorithm is not deterministic: each run has a success probability less than 1 (typically about 0.5) of finding the solution, so it may need to be repeated several times to succeed.
* If the quantum computer does not succumb to quantum noise or decoherence (e.g. temperature variations, electromagnetic field variations, vibrations…), it can break RSA, DH, ECDSA, ECDH.

For now only small special cases have been solved, but up to now quantum computers have always improved faster than expected (even faster than Moore’s law), as there is constant pressure for new technology able to solve certain kinds of problems. Therefore, it is very difficult to forecast when a quantum computer, in whatever technology becomes available, will be able to effectively run Shor’s algorithm on large numbers.
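
A classical Python sketch of the reduction Shor’s algorithm relies on: pick a random x, find its order r modulo N, then use gcd(x^(r/2) - 1, N) to split N. The order-finding step below is brute force (exponential); it is exactly the step the quantum circuit accelerates. The toy modulus 15 is an illustrative choice:

from math import gcd
import random

def order(x, N):
    # Smallest r such that x^r mod N == 1 (brute force here; this is the quantum-accelerated step).
    r, y = 1, x % N
    while y != 1:
        y = (y * x) % N
        r += 1
    return r

def factor_via_order(N):
    while True:
        x = random.randrange(2, N)
        if gcd(x, N) != 1:                                 # lucky pick: x already shares a factor with N
            return gcd(x, N)
        r = order(x, N)
        if r % 2 == 0 and pow(x, r // 2, N) != N - 1:      # "good" order, found with probability ~1/2
            return gcd(pow(x, r // 2) - 1, N)              # a non-trivial factor of N

print(factor_via_order(15))   # prints 3 or 5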

10
Q

Shor’s algorithm complexity

A

Number of effective qubits needed to factor n-bit keys:
– 2000 qubits to factor 1024-bit keys.
– 4000 qubits to factor 2048-bit keys.
– 8000 qubits to factor 4096-bit keys.

This is the number of effective qubits (i.e. after error correction).
Typically, there is 1 effective qubit for every 10 physical qubits (see QEC).

Shor’s algorithm requires a quantum circuit with a certain depth, i.e. the number of stages that process the information.
Circuit depth (in units of 10^6):
– roughly 900 × 10^6 as circuit depth for a 1024-bit key.
– roughly 1200 × 10^6 as circuit depth for a 2048-bit key.
– roughly 10000 × 10^6 as circuit depth for a 4096-bit key.

The connections between operators in a quantum computer are not flexible: there is a fixed combination of gates, with fixed connections designed at the microscopic level by the manufacturer.
Note: this assumes an ideal circuit, but mapping to a real quantum computer would multiply this by at least one order of magnitude.
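
A rough Python sketch of the qubit figures above, assuming (as on this card and the next) about 2 effective qubits per key bit and roughly 10 physical qubits per effective qubit after QEC:

def shor_qubit_estimate(key_bits, physical_per_logical=10):
    logical = 2 * key_bits                   # ~2000 for 1024-bit keys, ~4000 for 2048, ~8000 for 4096
    return logical, logical * physical_per_logical

for n in (1024, 2048, 4096):
    logical, physical = shor_qubit_estimate(n)
    print(f"{n}-bit key: ~{logical} effective (logical) qubits, ~{physical} physical qubits")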

11
Q

Quantum Error Correction (QEC)

A

Quantum Error Correction (QEC) explains why there’s a reduction factor of 10 when considering the effective number of qubits.
QEC is needed to eliminate the effect of noise on stored quantum information, faulty quantum gates, faulty quantum state preparation, and faulty measurements.

Classic error correction uses redundancy (i.e. copying the information). This is impossible in the quantum world because of the no-cloning theorem.

Shor discovered a method for formulating a QEC code by storing the information of one qubit onto a highly entangled state of nine qubits. In other words, to have 1 qubit effectively working with error correction, it must be spread over nine highly entangled physical qubits, which is where the reduction factor of roughly 10 comes from.

The measurement on the QEC qubits provides information about the error that happened, but not the value of the affected qubit (otherwise it would destroy the quantum superposition with the other qubits in the system). The possible errors are bit-flip (value) or phase-flip (sign) errors, which can be reverted by applying the appropriate Pauli operator.
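
For reference, the textbook form of Shor’s nine-qubit encoding (one logical qubit spread over a highly entangled nine-qubit state), in LaTeX:

% Shor's 9-qubit code: logical basis states
|0\rangle_{L} = \tfrac{1}{2\sqrt{2}}\bigl(|000\rangle + |111\rangle\bigr)^{\otimes 3}, \qquad
|1\rangle_{L} = \tfrac{1}{2\sqrt{2}}\bigl(|000\rangle - |111\rangle\bigr)^{\otimes 3}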

12
Q

Grover’s algorithm

A

Grover’s algorithm was developed in 1996 by Lov Kumar Grover. Given the (plaintext, ciphertext) pairs, it is able to find the key of a symmetric encryption algorithm, such as AES.

Grover’s algorithm is a quantum algorithm for unstructured search:
* Finds with high probability the unique input to a black-box function that produces a specific output value (a preimage).
* Uses O(√N) evaluations of the function, where N is the size of the function’s domain.
* Normal search is O(N), so there is a quadratic speed-up.

Therefore, Grover’s algorithm can be used to accelerate brute-force attacks on symmetric ciphers:
* 2^64 quantum operations needed to break AES-128.
* 2^128 quantum operations needed to break AES-256.

Grover’s algorithm can also be used to find collisions in hash functions, but the rho algorithm performs better.

Pros
– Useful also for other applications (unstructured searches)
– A large specialized quantum computer that can recover AES-128 keys will likely be much smaller and easier to build than one which implements Shor’s algorithm for 256-bit ECs or 2048-bit RSA/DSA keys.

Cons
– Quantum-operation overhead, which is much bigger than for traditional non-quantum operations. Moreover, the connections between qubits and the circuit depth are tailored to this specific problem, which is both good and bad at the same time, as it costs billions: to run Grover’s algorithm, a dedicated QC with a precise connectivity and depth must be designed, and that QC can only be used to solve this one problem.
– Lack of parallelization in the search, as the algorithm is based on serial operations.

A possible countermeasure to keep protecting our cryptosystems from a QC designed to run this algorithm is to double the key size (though no AES-512 exists) or to double the hash length.
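
A toy Python state-vector simulation of Grover’s search (a classical illustration of the ~(pi/4)·sqrt(N) iteration count, not a real quantum computation):

import math
import numpy as np

def grover_search(n_qubits, marked):
    N = 2 ** n_qubits
    amps = np.full(N, 1 / math.sqrt(N))               # uniform superposition over N items
    iterations = round(math.pi / 4 * math.sqrt(N))    # O(sqrt(N)) oracle evaluations
    for _ in range(iterations):
        amps[marked] *= -1                            # oracle: flip the sign of the marked item
        amps = 2 * amps.mean() - amps                 # diffusion: inversion about the mean
    probabilities = amps ** 2
    return int(np.argmax(probabilities)), probabilities[marked]

index, prob = grover_search(n_qubits=8, marked=42)
print(f"found index {index} with probability {prob:.3f}")   # close to 1 after ~13 iterations for N=256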

13
Q

Brassard’s algorithm

A

Brassard’s algorithm solves the collision problem for a black-box r-to-one function F:
* r different inputs will generate the same output value
* find a collision, i.e. find x0 and x1 such that F (x0) = F (x1).

It is straightforward to notice that this algorithm is suitable for attacking hash functions, with better performance than the birthday attack.
Brassard’s algorithm uses Grover’s algorithm in a different way, trading off space for speed. In this sense, the comparison Grover vs. Brassard for collision search is as follows:
* Grover: time = O(N^(1/2)) and space = O(log N).
* Brassard: time = O((N/r)^(1/3)) and space = O(N^(1/3)), with r the parameter of the r-to-one function F.

Notice that Brassard is faster than Grover, at the cost of a higher memory complexity.
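
A worked example in LaTeX, assuming r is small so that N/r ≈ N = 2^n for an n-bit hash (these values reappear as NIST security levels 2 and 4 on a later card):

% Brassard collision search on an n-bit hash (N = 2^n outputs):
T_{\text{Brassard}} \approx N^{1/3} = 2^{n/3}
\quad\Rightarrow\quad
\text{SHA-256: } 2^{256/3} \approx 2^{85}, \qquad \text{SHA-384: } 2^{384/3} = 2^{128}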

14
Q

Mosca’s rule

A

Given:
- Q: time to develop a quantum computer for Shor/Grover algorithms
- S: time data needed to remain secure
- U: time needed to update crypto systems from classic to quantum-resistant (i.e. post-quantum) solutions.

If: (U + S > Q) Then: “Houston, we have a problem!”

Only after the update is complete will newly created data be post-quantum resistant. The risk window is the interval of time during which data can be copied and harvested: no QC capable of breaking the encryption exists yet, but the algorithms in use are not yet quantum-resistant. Therefore, once Q has passed, the harvested data can be decrypted.
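
A minimal Python sketch of the rule with illustrative (made-up) numbers:

# Mosca's inequality: if U + S > Q, data encrypted before the migration finishes is at risk.
S = 10   # years the data must remain secure
U = 8    # years needed to migrate to post-quantum algorithms
Q = 15   # years until a cryptographically relevant quantum computer exists

if U + S > Q:
    print(f"Houston, we have a problem: an exposure window of {U + S - Q} years")
else:
    print("Migration completes early enough: no exposure window")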

15
Q

SIKE

A

SIKE, which stands for Supersingular Isogeny Key Encapsulation, is a KEM algorithm proposed by Microsoft to the NIST Post-Quantum competition.

  • SIKE is based on the Supersingular Isogeny Diffie-Hellman (SIDH) key exchange protocol (i.e. a variant of DH on specific curves).
  • It uses arithmetic operations on elliptic curves over finite fields and computes maps (isogenies) between such curves.
  • Its security relies on the hardness of finding a specific isogeny between two elliptic curves, which is equivalent to finding a path in the isogeny graph. Notice that this is a different problem from the discrete logarithm on an EC (the basis of ECDH).
  • SIKE was implemented by Microsoft and Cloudflare as a constant-time library, to avoid time-based side-channel attacks. But it was attacked in another ingenious way, namely Hertzbleed.

Besides the Hertzbleed attack, there are other attacks against the intrinsic math of SIDH, so the authors decided to withdraw the proposal from the NIST PQ competition. They did so with a final submission that keeps a record of the problem, describing their failure in detail so that others do not repeat the same mistake.

16
Q

SIKE and the Hertzbleed attack

A

The Hertzbleed attack introduced a new family of side-channel attacks: frequency-based side-channel attacks.
* Hertzbleed exploits dynamic CPU frequency scaling, by observing the frequency of the CPU at given periods of time.
* In the worst case, it’s able to extract cryptographic keys on remote servers.
* All Intel CPUs and many AMD ones are affected.
* Since it is a hardware-related side channel, no patch is possible.
* Rather, new guidelines for software developers and additional checks, with performance penalty, were introduced.

17
Q

NTRU

A

NTRU offers both NTRUEncrypt (encryption) and NTRUSign (digital signatures).

NTRU is:
* Immune to Shor’s algorithm.
* Provides fast private-key operations:
– RSA operations scale as the cube of the key size.
– NTRU operations scale as the square of the key size.
* In 2010, NTRU was only 20 times slower than AES-256 at equivalent strength.

18
Q

The FALCON problem

A

FALCON is a digital signature algorithm:
– Fast.
– Small key sizes.
– However, it requires floating-point operations (due to the FFT), so it is subject to time-based side-channel attacks caused by non-constant-time operations.

19
Q

NIST PQC security levels

A

In the past, the security of cryptographic algorithms was bound to the key size.
Nowadays, NIST defines security levels instead of just key sizes, due to the complexity and the multiple parameter choices of quantum-resistant algorithms, which affect the security level:
* Level 1: Equivalent to AES-128 exhaustive key search attack, requiring attack complexity 2^64 (Grover’s algorithm).
* Level 2: Equivalent to SHA-256 collision search attack, requiring attack complexity 2^85 (Brassard’s algorithm).
* Level 3: Equivalent to AES-192 exhaustive key search attack, requiring attack complexity 2^96 (Grover’s algorithm).
* Level 4: Equivalent to SHA-384 collision search attack, requiring attack complexity 2^128 (Brassard’s algorithm).
* Level 5: Equivalent to AES-256 exhaustive key search attack, requiring attack complexity 2^128 (Grover’s algorithm).
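
A small Python sketch reconstructing these thresholds from the speed-ups on the previous cards (Grover: square root of the key space, Brassard: cube root of the hash output space):

levels = {
    1: ("AES-128 exhaustive key search (Grover)", 128 // 2),
    2: ("SHA-256 collision search (Brassard)", 256 // 3),
    3: ("AES-192 exhaustive key search (Grover)", 192 // 2),
    4: ("SHA-384 collision search (Brassard)", 384 // 3),
    5: ("AES-256 exhaustive key search (Grover)", 256 // 2),
}
for level, (attack, exponent) in levels.items():
    print(f"Level {level}: {attack:42} ~2^{exponent}")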

20
Q

ML-DSA

A

Module-Lattice-Based Digital Signature Algorithm is a quantum-resistant digital signature.
The algorithms are named after the dimensions of the A matrix used in the computations; for example, ML-DSA-44 uses a 4x4 matrix.

21
Q

ML-KEM

A

Module-Lattice-Based Key Encapsulation Mechanism is a quantum-resistant encapsulation of a symmetric key.
The variants are named after a specific parameter set (NOT after the key size).
ML-KEM-768 is recommended as the default.

Notice that there are no longer public and private keys, but rather encapsulation and decapsulation keys, which is why they are labelled as KEMs (Key Encapsulation Mechanisms) instead of public-key algorithms.

22
Q

SLH-DSA

A

Stateless Hash-Based Digital Signature Algorithm is a quantum-resistant digital signature; it uses both:
* FORS (Forest of Random Subsets) for a few-time signature scheme.
* XMSS (eXtended Merkle Signature Scheme) for multi-time signatures.
This scheme limits the number of signatures per key but ensures strong security with extensive use of cryptographic hash functions:
* SHA-256 and SHAKE-128: Suitable for Level 1.
* SHA-512 and SHAKE-256: Suitable for all levels.

23
Q

PQC Practical Problems

A

Practical challenges associated with post-quantum cryptography:
* Increased size of signatures.
* Larger certificate sizes (including certificate chains).
* Growth in network packet sizes.
* Limited hardware support.
* Necessity for cryptographic agility to handle future attacks => Redesign of interfaces to allow security policies to govern cryptographic operations.
* Compatibility with middleboxes (e.g., proxies, firewalls, IDS/IPS).
* Legacy system integration.
* Significant effort required for widespread adoption.

24
Q

Quantum Key Distribution (QKD)

A

QKD is often erroneously called “quantum cryptography.” Instead, it is used to generate a shared key via a physical channel (typically an optical fibre but other solutions are possible):
* Eavesdroppers are detectable as their measurements alter the quantum state.
* QKD requires a classical authenticated side channel for integrity, limiting its applicability, as pure QKD is meaningless.

Key limitations:
* Effective over distances of 10–100 km.
* Provides a key rate of only 0.1–1 Mbps.
* No true end-to-end security: Only supports hop-by-hop or site-to-site security.

A QKD protocol requires 5 parts:
* Encoding system
* Source (Alice)
* Quantum channel (data transmission) + Classical channel (Key sifting i.e. the process of determining which bits of the key are valid).
* Receiver (Bob)
* Post-processing: Error-correction and Privacy Amplification: assuming an attacker has gained knowledge of some bits of the key but not all of them, we use a hash function to obtain a shorter key that cannot be obtained by the attacker.

25
Q

BB84 Protocol

A

The BB84 protocol involves the following steps:
1. Alice selects random polarization states to send her bits.
2. Bob uses random polarization states to detect qubits.
3. Bob tells Alice the polarization states used.
4. Alice tells Bob for which bits they used the same polarization basis (those are the bits to keep). This is called key sifting.

Additional notes:
* Even after sifting, there may be errors that arise from imperfect quantum channels or third-party (Eve’s) interference.
* Error correction codes are used to resolve errors.
* The final number of valid output bits, reduced due to the Quantum Bit Error Rate (QBER), is much smaller than the number of transmitted bits.
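
A toy Python simulation of steps 1-4 (ideal channel, no Eve, hence no QBER):

import secrets

n = 16
alice_bits  = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.choice("+x") for _ in range(n)]     # '+' rectilinear, 'x' diagonal
bob_bases   = [secrets.choice("+x") for _ in range(n)]

# If Bob guesses Alice's basis he reads her bit; otherwise his measurement result is random.
bob_bits = [a if ab == bb else secrets.randbelow(2)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Key sifting: keep only the positions where the bases matched (about half of them on average).
sifted_alice = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
sifted_bob   = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
assert sifted_alice == sifted_bob           # identical here because the channel is ideal
print("sifted key:", "".join(map(str, sifted_alice)))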

26
Q

Photon Number Splitting (PNS) Attack

A

This attack can be performed by an active attacker (Eve), who exploits imperfections of the devices used in practical QKD implementations, allowing for different attacks.
The PNS attack exploits imperfections of the source (Alice) generating qubits in BB84. Eve counts the number of photons generated in each pulse of the source:
* For single-photon pulses (N = 1), Eve blocks the pulse.
* For multi-photon pulses (N > 1), Eve stores one photon and forwards others.
* Eve uses the basis information disclosed between Alice and Bob on the classical authenticated channel (which does not provide confidentiality) during key sifting to measure the stored photons in the correct basis.

27
Q

Decoy State Protocol

A

A modified version of BB84 designed to mitigate PNS attacks, where different types of pulses are sent:
* Same wavelength, duration, and shape, but different intensity.
* One type carries information, the others are decoy pulses (not used).
* For each pulse, the basis and the type are disclosed.
* Comparing the number of received decoy pulses to the number sent allows detecting a PNS attack.

28
Q

Usage of keys obtained via QKD

A
  • As key for classic symmetric crypto (e.g. AES-256, HMAC-SHA-256)
  • As key for one-time pad encryption (the only perfect encryption algorithm)
  • NOTE: there’s no such thing as a quantum encryption algorithm. Only quantum key distribution.
29
Q

One-Time Pad Encryption

A

The One-Time Pad (OTP) is a perfect secret key encryption algorithm, but it requires the use of a one-time pad, which is:
* Single-use.
* A pre-shared key.
* At least as large as the plaintext to be encrypted.

Each plaintext data unit (bit, byte, or character) is encrypted by combining it with the corresponding unit of the pad using modular addition (i.e., XOR).
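
A minimal Python sketch of this XOR combination (the plaintext below is an illustrative choice):

import secrets

plaintext = b"ATTACK AT DAWN"
pad = secrets.token_bytes(len(plaintext))      # single-use, truly random, as long as the plaintext

ciphertext = bytes(p ^ k for p, k in zip(plaintext, pad))
recovered  = bytes(c ^ k for c, k in zip(ciphertext, pad))
assert recovered == plaintext                  # XOR with the same pad decrypts; never reuse the pad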

When combined with QKD, one-time pads can be:
* Distributed on-the-fly.
* Used from a key pool (though this poses dangers).

For the OTP to be secure, the key must:
* Be at least as long as the plaintext.
* Be random:
– Uniformly distributed across all possible keys.
– Independent of the plaintext.
– Generated from a truly chaotic source, such as a hardware RNG.
* Be patternless, following G. Chaitin’s definition: Passing statistical randomness tests is not sufficient.
* Contain a number of entropy bits at least equal to the number of plaintext bits: Prevents attackers from deducing the key by analyzing the RNG.
* Never be reused, in whole or in part.
* Be kept completely secret from unauthorized parties.

OTP encryption has practical challenges:
* Size of the one-time pad.
* Distribution and storage.
* Randomness: keys generated by pseudo-random number generators (PRNGs) can be predicted.
* Secure generation: vulnerable to real-time RAM/CPU attacks or forensic analysis.
* Secure exchange: Man-in-the-Middle (MITM) or malicious tampering (MATE) attacks are concerns. Here’s where QKD plays a role.
* Secure use: Attacks during real-time key usage are possible.
* Secure disposal: Key material must be erased without leaving forensic traces.

30
Q

Key Combiner

A

Combines multiple shared secrets into one:
* Formula: ss = keyCombiner(ss1, ss2, …, ssN ).
* Ensures resulting key remains secure even if one secret is compromised.
* Can combine secrets from different sources, where the shared secrets may be agreed, distributed, pre-shared or generated on-the-fly (e.g., Diffie-Hellman, KEM, QKD).
* The “X-Wing” algorithm is a promising key combiner for hybrid cryptography (in this case, QKD with other keys).
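
A simplified Python sketch of the idea (a plain hash-based combiner for illustration only, NOT the actual X-Wing construction):

import hashlib
import hmac

def key_combiner(*shared_secrets: bytes, context: bytes = b"hybrid-key-v1") -> bytes:
    # Length-prefix each secret so the concatenation is unambiguous, then derive one key.
    material = b"".join(len(ss).to_bytes(2, "big") + ss for ss in shared_secrets)
    return hmac.new(context, material, hashlib.sha256).digest()

ss_classical = b"\x01" * 32   # e.g. from an ECDH exchange (placeholder value)
ss_pq        = b"\x02" * 32   # e.g. from ML-KEM (placeholder value)
ss_qkd       = b"\x03" * 32   # e.g. from a QKD link (placeholder value)
print(key_combiner(ss_classical, ss_pq, ss_qkd).hex())   # stays secure if at least one input stays secret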