Quantum Proofing Next Generation PKI and Digital Certificates


Sensational articles about quantum computers, appearing as early as 2016, have created uncertainty about what will happen to data security if quantum computers of sufficient power are ever built. This article attempts to shed some light on the situation.

What is Quantum Computing?

Quantum computing is the application of quantum-mechanical principles to perform computations. Specifically, quantum computing exploits quantum states of subatomic particles, such as superposition and entanglement, to build quantum computers. On quantum computers of sufficient power, specific algorithms can perform computations much faster than classical computers and can even solve problems that are out of reach of current computing technology. As a result, there is increased interest from governments and industries worldwide in developing quantum computers. The field is still in its infancy, but development is gaining traction, and working quantum computers already exist, albeit very weak ones at this point.


Classical and Quantum Computing

Classical computing uses bits, which express the physical phenomenon of electric current passing through circuits as ones and zeros. By manipulating these ones and zeros, a computer can express complicated problems and solve them.
 
Quantum computers, on the other hand, use quantum bits, or qubits, as the basis of computing. Qubits are two-state quantum-mechanical systems; examples include the spin of an electron or the polarization of a single photon. Using qubits, we can exploit matter's peculiar quantum-mechanical states, such as entanglement and superposition, to perform computations.
 
When a qubit is in superposition, it is neither a one nor a zero but a possibility of both. So, one qubit can represent two states simultaneously. Add another qubit, and you can represent four possibilities simultaneously; with each added qubit, the number of possibilities that can be expressed doubles. In general, n qubits can represent 2^n states. For example, a quantum computer with ten qubits can simultaneously represent 1,024 states, whereas ten classical bits can hold only one of those 1,024 states at a time.
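The exponential growth is easy to see numerically. The short Python sketch below (illustrative only; it simulates nothing quantum) tabulates how many simultaneous states n qubits can represent, and roughly how much classical memory a full description of their state would need:

```python
# The state of n qubits is described by 2**n complex amplitudes, which is
# why classically simulating even modest quantum computers is infeasible.
for n_qubits in (10, 20, 30, 40, 50):
    states = 2 ** n_qubits
    # One complex amplitude in double precision takes 16 bytes.
    memory_gib = states * 16 / 2 ** 30
    print(f"{n_qubits:2d} qubits -> {states:>20,} states "
          f"(~{memory_gib:,.2f} GiB to store classically)")
```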
 
Entanglement is a quantum property of subatomic particles that is not easy to explain, and we do not have a clear scientific explanation of its underlying mechanism. But as far as quantum computing is concerned, entanglement allows qubits to correlate with one another instead of acting randomly.
 
The combined exploitation of superposition and entanglement allows us to create vast computational spaces with multiple dimensions, thus executing computations in parallel rather than in sequence. 
 
Quantum computing can solve some complex problems that classical computing cannot, because of the memory the latter would require. For example, quantum computing could allow the accurate mathematical representation of molecular interactions in a chemical reaction, promising significant progress in various scientific and technological sectors. It will also solve, in a fraction of the time, problems that classical computing can only grind through slowly, including those that form the core of current cryptographic schemes.

How can quantum computing influence cryptography?

As discussed above, cryptography is based on the existence of intractable mathematical problems. This does not mean they are unsolvable, but that the time and resources required to reverse them make them practically safe.
 
Quantum computing changes this ecosystem: specific quantum algorithms drastically reduce the time needed to solve such problems.
 
Consider, for example, the algorithm discovered by Shor in 1994. Run on a sufficiently powerful quantum computer, Shor's algorithm can solve the integer factorization problem almost exponentially faster than the most efficient known classical algorithm. Integer factorization is the basis of the widely popular RSA public-key encryption scheme. As stated in NIST's Report on Post-Quantum Cryptography:
 

“In 1994, Peter Shor of Bell Laboratories showed that quantum computers, a new technology leveraging the physical properties of matter and energy to perform calculations can efficiently solve each of these problems, thereby rendering all public-key cryptosystems based on such assumptions impotent. Thus a sufficiently powerful quantum computer will put many forms of modern communication—from key exchange to encryption to digital authentication—in peril.”

In short, a quantum computer of sufficient power could outright break the Public Key Infrastructure, creating the need to redesign the whole cybersecurity ecosystem.
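To see why factoring is the whole ballgame for RSA, here is a minimal sketch with toy parameters (real moduli are 2048 bits or more). It does not implement Shor's algorithm itself; it only shows that once an attacker knows the factors p and q of the public modulus, the private key follows immediately:

```python
from math import gcd

# Toy RSA key; real keys use 2048-bit and larger moduli.
p, q = 61, 53                  # the secret primes
n = p * q                      # public modulus (3233)
e = 17                         # public exponent
phi = (p - 1) * (q - 1)        # computable only if you know p and q
assert gcd(e, phi) == 1
d = pow(e, -1, phi)            # private exponent

message = 42
ciphertext = pow(message, e, n)

# An attacker who factors n -- exactly what Shor's algorithm does
# efficiently on a large enough quantum computer -- rebuilds d at once:
recovered_d = pow(e, -1, (p - 1) * (q - 1))
print(pow(ciphertext, recovered_d, n))   # -> 42, the plaintext
```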
 
But this is not all. Another algorithm, this one by Grover, poses a threat to symmetric cryptography, although not as severe a one as Shor's. Run on a sufficiently powerful quantum computer, Grover's algorithm cracks symmetric keys with a quadratic speedup over classical brute-force search. This is a significant improvement, but it can be countered by using larger keys, which maintains the current security level.
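The arithmetic behind that countermeasure is simple, as this small sketch illustrates: brute-forcing a k-bit key takes roughly 2^k classical trials but only about 2^(k/2) Grover iterations, so doubling the key length restores the original margin:

```python
# Approximate work factors for brute-forcing a k-bit symmetric key:
# ~2**k classical trials versus ~2**(k / 2) Grover iterations.
for key_bits in (128, 256):
    print(f"{key_bits}-bit key: ~2^{key_bits} classical trials, "
          f"~2^{key_bits // 2} Grover iterations")

# AES-128 drops to ~2^64 of quantum work, which is uncomfortably low;
# AES-256 still costs ~2^128, so doubling key sizes restores the margin.
```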

Will quantum computing come soon?

 
Physics has proven that quantum computing is feasible; now it is a problem of engineering, albeit a very difficult one. The construction of quantum computers involves state-of-the-art technology such as superfluids and superconductors. The challenge of creating a stable and scalable quantum-mechanical system is immense, and it leads teams all over the world to pursue different paths. There are several models of quantum computation, including the quantum circuit model, the quantum Turing machine, the adiabatic quantum computer, the one-way quantum computer, and various quantum cellular automata. The most widely used is the quantum circuit model.
 
A significant issue with any model of quantum computer is that, by their nature, qubits lose their superposition once measured and are therefore very sensitive to outside interference. This makes it challenging for qubits to maintain their quantum states. Partial solutions include the use of ion traps, but total elimination of external interference is probably unachievable. As a result, one of the most crucial requirements for building quantum computers is a robust error-correction mechanism.
The big picture is that a breakthrough could happen at any moment, or it could take years until a working prototype of sufficient computational power is created. A few prototypes already exist, IBM Q System One being the most famous, but their computational power is still too small to threaten cryptographic systems. That does not mean the cybersecurity community can relax: even if we had an efficient post-quantum security scheme today, migrating the whole ecosystem to this new standard would be a huge task. Consequently, several efforts are underway to prepare for the post-quantum era.

What can we do?

When widespread quantum computing technology arrives, we will need to be ready with a quantum-resistant PKI. Many ongoing projects pursue this goal, and many proposed technologies could provide a solution. Below, we summarize the most promising of these technologies, briefly review the collective projects underway to establish post-quantum cryptography, and note the challenges that lie ahead.

Families of post-quantum algorithms

Research over the last 15-20 years has produced families of algorithms that are believed to resist quantum attacks. Below, we briefly describe the most promising of these families, which could provide security in a post-quantum world.

Code-based Cryptography

Code-based cryptography uses error-correcting codes to build public-key cryptography. It was first proposed by Robert McEliece in 1978 and is one of the oldest and most researched asymmetric encryption schemes. A signature scheme can be constructed from the Niederreiter scheme, the dual variant of the McEliece scheme. The McEliece cryptosystem has resisted cryptanalysis so far; the main problem with the original system is its large private and public keys.
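To convey the idea, here is a deliberately tiny McEliece-style sketch built on the Hamming(7,4) code. It is a toy under obvious assumptions (a 7-bit code and a single injected error are nowhere near secure; real parameters use large Goppa codes), but it shows the mechanism: the public key is a scrambled generator matrix, encryption adds deliberate errors, and only the holder of the secret structure can decode them away:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hamming(7,4) code in systematic form: G = [I | A], H = [A^T | I].
A = np.array([[1,1,0], [1,0,1], [0,1,1], [1,1,1]])
G = np.hstack([np.eye(4, dtype=int), A])
H = np.hstack([A.T, np.eye(3, dtype=int)])

# Secret scrambler S (invertible over GF(2)) and permutation P.
S     = np.array([[1,0,0,0], [1,1,0,0], [0,1,1,0], [0,0,1,1]])
S_inv = np.array([[1,0,0,0], [1,1,0,0], [1,1,1,0], [1,1,1,1]])
P = np.eye(7, dtype=int)[rng.permutation(7)]

G_pub = (S @ G @ P) % 2          # public key: looks like a random code

def encrypt(m):
    e = np.zeros(7, dtype=int)
    e[rng.integers(7)] = 1       # one deliberate bit error
    return (m @ G_pub + e) % 2

def decrypt(c):
    y = (c @ P.T) % 2            # undo the secret permutation
    s = (H @ y) % 2              # the syndrome locates the error...
    for i in range(7):
        if np.array_equal(s, H[:, i]):
            y[i] ^= 1            # ...which only the code's owner can fix
    return (y[:4] @ S_inv) % 2   # unscramble: systematic bits are m*S

m = np.array([1, 0, 1, 1])
assert np.array_equal(decrypt(encrypt(m)), m)
print("toy McEliece round-trip OK")
```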

Hash-based cryptography

Hash-based cryptography is a promising post-quantum approach for digital signatures. Hash functions map strings of arbitrary length to strings of fixed length; they are already among the most widely used cryptographic tools, and their security against both classical and quantum attacks is well understood. It has long been known that hash functions alone suffice to build public-key signatures, making hash-based schemes some of the oldest public-key signature proposals. In addition, hash-based cryptography is flexible and can meet different performance expectations. On the downside, hash-based signature schemes are mainly stateful: the private key must be updated after every use, and the signer must keep track of information such as how many times a key has been used to create a signature; otherwise, security is not guaranteed. Stateless hash-based schemes exist, but they come at the cost of longer signatures and longer processing times.
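The classic building block here is Lamport's one-time signature scheme, which needs nothing but a hash function; schemes like XMSS and SPHINCS+ are elaborate ways of combining many such one-time keys. Below is a minimal Python sketch (illustrative, not production code) that also makes the statefulness problem concrete: each signature reveals half of the secret key, so a key pair must never sign twice:

```python
import hashlib, secrets

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [[secrets.token_bytes(32) for _ in range(2)] for _ in range(256)]
    pk = [[hashlib.sha256(x).digest() for x in pair] for pair in sk]
    return sk, pk

def message_bits(message):
    digest = hashlib.sha256(message).digest()
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(message, sk):
    # Reveal one secret per digest bit -- which is why the key is one-time.
    return [sk[i][b] for i, b in enumerate(message_bits(message))]

def verify(message, signature, pk):
    return all(hashlib.sha256(s).digest() == pk[i][b]
               for i, (s, b) in enumerate(zip(signature, message_bits(message))))

sk, pk = keygen()
sig = sign(b"hello post-quantum world", sk)
print(verify(b"hello post-quantum world", sig, pk))   # True
print(verify(b"tampered message", sig, pk))           # False
```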

Lattice-based cryptography

Lattice-based cryptography is a particular case of subset-sum-based cryptography, first introduced by Ajtai in 1996. It is the generic term for cryptographic primitives constructed using lattices. Some of these constructions appear to be resistant to attacks by both quantum and classical computers. Additionally, they have other attractive features, such as worst-case hardness guarantees, simplicity, and parallelism, and they are versatile enough to construct robust cryptographic schemes. Finally, lattices are the only family of algorithms containing all three kinds of primitives required to build a post-quantum Public Key Infrastructure: public-key encryption, key exchange, and digital signatures.
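Many lattice schemes (including Kyber, Saber, and Frodo-KEM from the NIST table below) rest on the Learning With Errors (LWE) problem: linear equations perturbed by small noise are easy to generate but hard to solve. The following toy sketch, with deliberately insecure parameters chosen for readability, encrypts a single bit in the style of Regev's LWE cryptosystem:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, q = 8, 20, 97        # toy sizes; real schemes use n in the hundreds

# Key generation: secret s and public noisy samples (A, b = A.s + e).
s = rng.integers(0, q, n)
A = rng.integers(0, q, (m, n))
e = rng.integers(-1, 2, m)             # small noise in {-1, 0, 1}
b = (A @ s + e) % q

def encrypt(bit):
    # Sum a random subset of the public samples and embed the bit.
    subset = rng.integers(0, 2, m)
    return (subset @ A) % q, (subset @ b + bit * (q // 2)) % q

def decrypt(c1, c2):
    # With s, the leftover noise is small and rounding reveals the bit;
    # without s, recovering it means solving the LWE problem.
    v = (c2 - c1 @ s) % q
    return 1 if q // 4 < v < 3 * q // 4 else 0

for bit in (0, 1):
    assert decrypt(*encrypt(bit)) == bit
print("toy LWE round-trip OK")
```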

Multivariate cryptography

Multivariate cryptography refers to public-key cryptography whose public keys represent a multivariate, nonlinear (usually quadratic) polynomial map. Solving such systems is proven to be NP-complete, making this family of algorithms a good candidate for post-quantum cryptography. Currently, multivariate encryption schemes have proven less efficient than other schemes, since they require substantial public keys and long decryption times. On the other hand, they have turned out to be more suitable for building signature schemes, as they provide the shortest signature sizes among post-quantum algorithms, although they incur rather large public keys.
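The underlying "MQ" (multivariate quadratic) problem is easy to state in code. In the toy sketch below, evaluating a random system of quadratic polynomials over GF(2) at a point is trivial, while inverting it without a trapdoor amounts to exhaustive search over 2^n assignments (feasible here only because the toy uses 12 variables; real schemes use far more):

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)
n_vars, n_eqs = 12, 12       # toy size; real schemes use many more variables

# Random quadratic polynomials over GF(2): a coefficient for each x_i * x_j.
Q = rng.integers(0, 2, (n_eqs, n_vars, n_vars))

def evaluate(x):
    # Easy direction: plugging x into the public polynomials is fast.
    return np.array([(x @ Q[k] @ x) % 2 for k in range(n_eqs)])

secret = rng.integers(0, 2, n_vars)
target = evaluate(secret)

# Hard direction: without the private trapdoor, finding a preimage means
# searching all 2**n_vars assignments -- exponential in the variable count.
for candidate in itertools.product((0, 1), repeat=n_vars):
    if np.array_equal(evaluate(np.array(candidate)), target):
        print("preimage found:", candidate)
        break
```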

Isogeny-based cryptography

Isogeny-based cryptography uses maps between elliptic curves to build public-key cryptography. The candidate for post-quantum cryptography is the Supersingular Isogeny Diffie-Hellman key exchange (SIDH), introduced in 2011, which makes this scheme the most recent among the candidates. SIDH requires one of the smallest keys among the proposed key-exchange schemes and supports perfect forward secrecy. However, its relative youth means there are not many schemes based on this concept, and there has been little time to inspect them for possible vulnerabilities.

Projects for post-quantum cryptography

There are various working groups for post-quantum cryptography schemes, like the Open Quantum Safe (OQS) project and ENISA. Still, the most coherent initiative is the NIST Post-Quantum Cryptography Standardization Project, underway since 2017. As the name implies, the project aims to choose a suitable cryptography scheme to become the industry standard in the post-quantum era. The process began with 69 candidate algorithms, of which 26 advanced to the second round of evaluation. In July 2020, the round 3 candidates were announced, as shown in the table below: seven finalists and eight alternate candidates. The table notes whether each scheme is considered for encryption or signatures, its algorithm family, and the hard problem on which it is based.

Scheme | Enc/Sig | Family | Hard Problem

Round 3 Finalists:
Classic McEliece | Enc | Code-Based | Decoding random binary Goppa codes
Crystals-Kyber | Enc | Lattice-Based | Cyclotomic Module-LWE
NTRU | Enc | Lattice-Based | Cyclotomic NTRU Problem
Saber | Enc | Lattice-Based | Cyclotomic Module-LWR
Crystals-Dilithium | Sig | Lattice-Based | Cyclotomic Module-LWE and Module-SIS
Falcon | Sig | Lattice-Based | Cyclotomic Ring-SIS
Rainbow | Sig | Multivariate-Based | Oil-and-Vinegar Trapdoor

Round 3 Alternate Candidates:
BIKE | Enc | Code-Based | Decoding quasi-cyclic codes
HQC | Enc | Code-Based | Coding variant of Ring-LWE
Frodo-KEM | Enc | Lattice-Based | LWE
NTRU-Prime | Enc | Lattice-Based | Non-cyclotomic NTRU Problem or Ring-LWE
SIKE | Enc | Isogeny-Based | Isogeny problem with extra points
GeMSS | Sig | Multivariate-Based | 'Big-Field' trapdoor
Picnic | Sig | Symmetric Crypto | Preimage resistance of a block cipher
SPHINCS+ | Sig | Hash-Based | Preimage resistance of a hash function

The algorithm evaluation was based on the three criteria shown below.

  • Security: This is the most crucial criterion. NIST has established several factors to consider for evaluating the security provided by each candidate algorithm. Apart from the quantum resistance of algorithms, NIST has also defined additional security parameters that are not part of the current cybersecurity ecosystem. These are perfect forward secrecy, resistance to side-channel attacks, and resistance to multi-key attacks. 
  • Cost and performance: The algorithms are evaluated based upon their performance metrics like key sizes, the computational efficiency of public and private key operations and generation, and decryption failures.
  • Algorithm and implementation characteristics: Assuming the algorithms provide good overall security and performance, they are evaluated based on their flexibility, simplicity, and adoption ease (like the existence or not of intellectual property covering the algorithm).

Cryptographic agility 

 
An important paradigm in designing information-security protocols is cryptographic agility. It dictates that protocols should support multiple cryptographic primitives, allowing the systems implementing a particular standard to choose which combinations of primitives are suitable. The primary goal of cryptographic agility is to allow rapid replacement of vulnerable cryptographic primitives and algorithms with robust ones, without making disruptive changes to the system's infrastructure. This paradigm proves crucial in post-quantum cryptography design and requires at least partial automation. For example, an average enterprise holds upward of hundreds of thousands of certificates and keys, and that number continues to grow. With so many certificates, organizations must deploy automated methods to quickly replace these certificates if the cryptography they rely on becomes insecure.
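In code, cryptographic agility mostly means keeping the algorithm choice a named, configurable parameter behind a stable interface rather than a hard-coded dependency. The sketch below is a hypothetical Python registry (all names are illustrative, not a real library API) showing the shape of the idea:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class SignatureScheme:
    keygen: Callable
    sign: Callable
    verify: Callable

# A registry keyed by algorithm name: retiring a broken primitive becomes
# a configuration change instead of a rewrite of every call site.
REGISTRY: dict[str, SignatureScheme] = {}

def sign_document(data: bytes, key, algorithm: str) -> bytes:
    # Callers name the algorithm; migrating to, say, a lattice-based
    # signature means registering it and changing a default setting.
    return REGISTRY[algorithm].sign(key, data)

# Demo with a placeholder scheme (a real deployment would register
# implementations from a vetted cryptography library):
REGISTRY["dummy"] = SignatureScheme(
    keygen=lambda: b"key",
    sign=lambda key, data: b"sig:" + data,
    verify=lambda key, data, sig: sig == b"sig:" + data,
)
print(sign_document(b"hello", b"key", algorithm="dummy"))
```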
 
An excellent first measure for organizations is to start implementing hybrid cryptography, in which quantum-safe public-key algorithms are used alongside traditional public-key algorithms (like RSA or elliptic curves) so that the solution is at least no less secure than existing traditional cryptography.
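Here is a minimal sketch of the hybrid idea, under the assumption that the two shared secrets come from a classical exchange (e.g., X25519) and a post-quantum KEM (e.g., a lattice-based one): the session key is derived from both, so an attacker must break both schemes. Real protocols use a proper KDF such as HKDF with careful domain separation; a single hash is shown here only for brevity:

```python
import hashlib

def combine_shared_secrets(classical_secret: bytes, pq_secret: bytes,
                           context: bytes = b"hybrid-kex-v1") -> bytes:
    # Derive the session key from BOTH secrets: the result stays safe
    # as long as at least one of the two exchanges remains unbroken.
    return hashlib.sha256(context + classical_secret + pq_secret).digest()

# Placeholder byte strings standing in for real X25519 and PQ KEM outputs:
session_key = combine_shared_secrets(b"\x01" * 32, b"\x02" * 32)
print(session_key.hex())
```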

Conclusion

 
Advancements in technology happen frequently, especially in a field like computing. Quantum computing will upset the cybersecurity field, but the industry is already seeking and discussing solutions. When the time comes for organizations to adapt to the new reality, it will mainly be a matter of logistics and preparedness.
 
 