Approaches to Deploying Post-Quantum Cryptography

With NIST’s Post-Quantum Cryptography (PQC) competition entering its final phase and news that IBM plans to build a 1,000-qubit quantum computer by 2023, the discussion of how to deploy quantum-resistant algorithms has taken center stage.

During the 2020 Keyfactor Critical Trust Virtual Summit, Russ Housley, Founder and Owner of Vigil Security, and Massimiliano Pala, Principal Architect, Security Services, R&D at CableLabs, discussed the strategies organizations can use to prepare for the quantum era. Check out the session here.

From Traditional to PQC Algorithms

According to Housley, the goal is to deploy post-quantum algorithms before there is a large-scale quantum computer able to break public-key algorithms in widespread use today. The working assumption is that while people gain confidence in the PQC algorithms and their implementations, security protocols will use a mix of traditional and PQC algorithms.

Organizations also need to recognize that upgrading existing PKI schemes will take a long time, and that the existing security protocols that rely on them must be updated as well, which will take just as long.

Approaches to Migrate to PQC Algorithms

Migrating from the public-key algorithms in use today to PQC algorithms can follow one of two approaches:

  • Two Certificates: Each certificate carries one public key and one signature. The first certificate uses a traditional algorithm, while the second uses PQC algorithms for both the public key and the signature.
  • One Certificate: A single certificate contains a sequence of traditional and PQC public keys and a sequence of traditional and PQC signatures (see the sketch after this list).
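
To make the structural difference concrete, here is a minimal sketch using a simplified Python data structure; the Certificate class, the algorithm names, and the placeholder key and signature values are illustrative assumptions, not part of any X.509 or DOCSIS specification:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Certificate:
    """Simplified stand-in for an X.509 certificate (illustrative only)."""
    subject: str
    issuer: str
    public_keys: List[bytes]   # one key for a traditional cert, several for a hybrid one
    signatures: List[bytes]    # likewise for the signatures
    algorithms: List[str]      # e.g. ["RSA-2048"] or ["RSA-2048", "Dilithium3"]

# Two-certificate approach: one traditional certificate plus one PQC certificate.
traditional_cert = Certificate("device-01", "Example CA", [b"<rsa-key>"], [b"<rsa-sig>"], ["RSA-2048"])
pqc_cert         = Certificate("device-01", "Example CA", [b"<pqc-key>"], [b"<pqc-sig>"], ["Dilithium3"])

# One-certificate approach: a single certificate carrying both key and signature sequences.
hybrid_cert = Certificate(
    "device-01", "Example CA",
    [b"<rsa-key>", b"<pqc-key>"],
    [b"<rsa-sig>", b"<pqc-sig>"],
    ["RSA-2048", "Dilithium3"],
)
```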

In both cases, the security protocols should mix traditional and PQC algorithms, so that both confidentiality and authentication rely on the two families together.

  • IPsec and TLS would use a Key Derivation Function (KDF) to compute the shared secret from two inputs: SS = KDF(SS_T, SS_PQC) (see the sketch below this list).
  • S/MIME could do the same, or use double encapsulation for confidentiality along with parallel signatures for authentication.
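
As an illustration of that KDF construction, here is a minimal sketch in Python using HKDF from the cryptography package; the random placeholder secrets stand in for the outputs of a traditional key exchange and a PQC KEM, which are not shown, and concatenating them is just one simple way to feed two inputs into a single KDF:

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Stand-ins for the two negotiated secrets; in a real handshake these would come
# from a traditional key exchange (e.g. ECDH) and a PQC KEM respectively.
ss_traditional = os.urandom(32)   # SS_T
ss_pqc         = os.urandom(32)   # SS_PQC

# SS = KDF(SS_T, SS_PQC): run both inputs through one KDF, so the derived key
# can only be recovered by breaking BOTH the traditional and the PQC algorithm.
shared_secret = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid traditional+PQC key derivation",
).derive(ss_traditional + ss_pqc)

print(shared_secret.hex())
```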

Each approach presents certain advantages and disadvantages.

Pros and Cons of Using One Certificate

When one certificate is used, the security protocols do not require any new fields, because the additional public keys and signatures are carried inside a single certificate, and there is no need to modify the certificate architecture. The security protocols do, however, still need to be updated to support the PQC algorithms.

On the other hand, certificate path validation requires additional processing and complexity to handle new corner cases: what happens if the traditional signature fails but the PQC one verifies, or vice versa? Is the certificate valid or not?
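
One way to make that decision explicit is a validation policy like the sketch below; the validate_hybrid_signature helper and the "both must verify" rule are illustrative assumptions, not a statement of how any standard resolves the question:

```python
def validate_hybrid_signature(traditional_ok: bool, pqc_ok: bool,
                              transition_period: bool = True) -> bool:
    """Illustrative policy for a hybrid certificate carrying two signatures.

    traditional_ok / pqc_ok are the results of verifying each signature separately.
    """
    if transition_period:
        # Conservative policy: both signatures must verify, so a failure of
        # either one (the corner case described above) invalidates the certificate.
        return traditional_ok and pqc_ok
    # After the transition, only the PQC signature matters.
    return pqc_ok

# The corner case: the traditional signature fails but the PQC one is good.
print(validate_hybrid_signature(traditional_ok=False, pqc_ok=True))  # -> False
```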

Another disadvantage is that the certificate becomes very large and carries the known pitfalls of the “jumbo” certificates of the 1990s, which bundled a key-agreement public key and a signature public key for the same user. That is why today the US DoD uses three distinct certificates per user: one for smart card login, one for encryption key management, and one for signature.

Pros and Cons of Using Two Certificates

With the two-certificate approach, the security protocols require new fields for the additional certificates. In exchange, there is no need to modify the certificate architecture, and certificate path validation continues to function as it does today.

This approach also avoids the known pitfalls of the “jumbo” certificate: the two certificates together are only slightly bigger than one, because the subject, issuer, and other metadata are repeated in both. Finally, when the transition period is over and all devices support PQC algorithms, we simply stop using the certificates with the traditional algorithms.

Based on this analysis, Russ Housley recommended following the two-certificate approach. His advice is to begin preparing the security protocols now for mixing the two certificates, and to start planning for the day when only PQC algorithms are used.

A Proposal for Transitioning a Whole Industry

CableLabs has developed the Data Over Cable Service Interface Specification (DOCSIS), an international telecommunications standard that adds high-bandwidth data transfer to an existing cable television (CATV) system. Many cable television operators use it to provide Internet access over their existing hybrid fiber-coaxial (HFC) infrastructure.

Massimiliano Pala provided a brief overview of how DOCSIS enforces authentication and authorization: the broadband industry has leveraged PKI to deliver secure authentication and to enable Baseline Privacy Interface Plus (BPI+), which encrypts customers’ traffic. The DOCSIS protocol relies on two different PKIs, both of which use the RSA algorithm for their public keys.


As the notion of Zero Trust networks is introduced to address the need for effective authentication and authorization of new entities on the network, the next generation of architectures will rely on the trustworthiness of the PKI. In DOCSIS, the new version of BPI+ enables mutual authentication across devices and networks.

The Quantum Threat

Classic cryptography relies on problems that are hard to reverse. For example, the security of the RSA algorithm assumes that factoring large numbers is a hard problem, while computing and validating signatures are relatively fast operations.

Quantum computing has been the center of attention in terms of both investment (Google and IBM) and results (reduced complexity of quantum computers, growing qubit counts, quantum logic gates, etc.). Quantum computers can be extremely efficient at probing for patterns.

Using Shor’s algorithm, a quantum computer needs roughly N/2 steps to factor a key, instead of the roughly 2^(N/2) steps required today, where N is the number of bits in the key.

However, quantum computers are not as efficient when there is no structure to exploit. Breaking symmetric encryption and hashing algorithms is “only” accelerated quadratically via Grover’s algorithm, which effectively halves the security level, so 256-bit security addresses this use case.
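
A quick back-of-the-envelope illustration of that effect; the helper below is purely illustrative:

```python
def effective_symmetric_bits(key_bits: int) -> int:
    """Grover's algorithm searches a space of 2^n keys in roughly 2^(n/2) steps,
    so the effective security level of a symmetric key is about halved."""
    return key_bits // 2

for key_bits in (128, 256):
    print(f"{key_bits}-bit key -> ~{effective_symmetric_bits(key_bits)}-bit security against a quantum attacker")
# 128-bit keys drop to ~64-bit security (breakable); 256-bit keys keep ~128-bit security.
```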

In DOCSIS, both the legacy and the new PKIs use the RSA algorithm, which belongs to the class of problems that quantum computers can solve efficiently. As a result, the identities protected by the PKI will no longer be secure and can be spoofed. Particularly worrisome is the factorization of the keys at the higher levels of the PKI hierarchy, i.e. the Root and Intermediate CAs.

On the encryption front, DOCSIS devices currently support 128-bit encryption. This is also a problem because, under the quantum threat, 128-bit security is effectively reduced to only 64 bits, which can be broken efficiently.


The legacy infrastructure still uses hashing algorithms that do not provide 256-bit security. Those algorithms will not be secure, although this is less of a problem because the DOCSIS 3.0 infrastructure will expire in less than 20 years.

How to Address the Quantum Threat

To address the quantum threat, said Massimiliano Pala, we must look at how to augment the PKI to support quantum-safe algorithms without establishing separate infrastructures or providing multiple-certificate support in the protocols. To preserve the security of the ecosystem, not only certificates but all PKI data structures must be protected against the quantum threat.

Together with multi-algorithm support, we need to investigate which algorithms can be used for post-quantum validation on devices where updates might not be possible and/or where support for quantum-safe algorithms cannot be deployed for private-key operations.

There is therefore a need for multiple validation patterns (a simple sketch follows the list):

  • Post-quantum devices can produce and validate quantum-safe authentications via post-quantum algorithms; classic algorithms are used only to validate classic devices.
  • Classic devices continue to use the same algorithm (RSA) and are required to use symmetric Pre-Shared Keys (PSKs) to protect signatures and certificate chains.
  • Classic devices with support for Post-Quantum Algorithm (PQA) validation can directly validate the new certificate chain by using both the classic and the PQA signatures, but they can only produce classic authentications.
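
As a rough illustration, the sketch below maps those three device classes to their validation strategies during the transition; the enum names and the mapping itself are illustrative assumptions, not part of the DOCSIS specifications:

```python
from enum import Enum, auto

class DeviceClass(Enum):
    POST_QUANTUM = auto()                  # produces and validates PQC authentications
    CLASSIC = auto()                       # RSA only, relies on symmetric PSKs
    CLASSIC_WITH_PQA_VALIDATION = auto()   # validates PQC chains, signs classically

def validation_strategy(device: DeviceClass) -> str:
    """Return which mechanism the device uses to validate certificate chains
    during the transition (an illustrative mapping of the three patterns above)."""
    if device is DeviceClass.POST_QUANTUM:
        return "validate with PQC algorithms; use classic algorithms only for classic peers"
    if device is DeviceClass.CLASSIC_WITH_PQA_VALIDATION:
        return "validate both classic and PQC signatures; produce classic signatures only"
    return "validate classic signatures; protect signatures and chains with symmetric PSKs"

for device in DeviceClass:
    print(device.name, "->", validation_strategy(device))
```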

Learn More

Although the time when a quantum computer will actually pose a threat to existing cryptographic algorithms is difficult to predict, organizations must start planning ahead.

Download our eBook on Crypto-Agile PKI to learn more about how you can start planning today.