

Top Takeaways from NIST’s Fifth PQC Standardization Conference


As we move closer to a post-quantum world, all eyes are on NIST (The National Institute of Standards and Technology).

NIST is the organization responsible for standardizing cryptographic algorithms specifically designed to resist attacks from quantum computers. The multi-year selection process is nearing its final phase, with three algorithms (BIKE, Classic McEliece, and HQC) surviving rigorous evaluation and advancing to the fourth round.

Now, these algorithms are one step closer to being accessible to organizations worldwide for integration into their encryption infrastructure. But they’re not there yet.

The latest step in the journey to standardization was NIST’s Fifth PQC Standardization Conference, held this April in Rockville, Maryland. 

The conference’s purpose was to discuss various aspects of the algorithms and obtain valuable feedback that will inform decisions on standardization. The submission teams for BIKE, Classic McEliece, Falcon, and HQC also provided an update on their algorithms.

I had the opportunity to attend the conference, and here are my top takeaways.

But first, why it matters…

Before digging into the specifics of the PQC Standardization Conference, it’s important to be clear about just how important this process is.

Quantum computing is coming, and while no one knows exactly when cryptographically relevant quantum computers will arrive, we know that once they do, most of our existing encryption algorithms will become obsolete.

So, while our current algorithms can still protect against non-quantum threats, experts emphasize the need for proactive planning to ensure long-term security. This is particularly important as integrating new algorithms across all computer systems can be time-consuming and span several years.

As a result, industry and government organizations are racing against the advancement of quantum computing to implement new standards. The algorithms selected by NIST will form the basis of those standards, providing the world with the first tools to protect sensitive information from this new threat.

A new phase means new challenges

NIST announced that it will release the Federal Information Processing Standards (FIPS) 203, 204, and 205 around the summer of 2024. This announcement prompted each of the Round 4 candidates to share updates on how they plan to move forward. Several candidates also addressed recent challenges discovered with their algorithms. Here are the top-line messages from each:

BIKE
The BIKE team addressed a paper attacking their algorithm, arguing that the attack is not practical, and proposed an implementation change to address it.

Classic McEliece
The Classic McEliece team stands firmly behind their algorithm. They claim it is the most stable of all the PQC candidates, citing several projects that already use it, including Adva in high-speed optical networks, Crypto4A in HSMs, software implementations like Bouncy Castle, and VPNs like Mullvad VPN and Rosenpass.

Falcon
The Falcon team (along with NIST) will release an initial draft sometime in the fall of 2024. They mentioned the difficulty of implementing floating-point arithmetic and the problems it can introduce, sharing how they plan to adopt features from the Hawk algorithm, which uses ANTRAG to reduce the number of floating-point operations.

HQC
The HQC team released new security proofs that do not affect the current implementation or its inputs and outputs. They also presented a countermeasure for a recent timing attack and a fix for a division-modulo timing side-channel vulnerability.

New FIPS bring about additional changes

NIST made several announcements about changes to expect in the new FIPS documents. FIPS 203 (Kyber/ML-KEM) will undergo some significant changes, including reverting the A-matrix indexing, which breaks backward compatibility and changes the standardized test vectors.

Next, SampleNTT will now use a dedicated XOF API, because the existing standard did not allow SHAKE to be used as a stream. NIST will also specify a "lower-level" derandomized API that enables CAVP testing, and will not give KEM guidance in FIPS 203 itself; that guidance will appear in SP 800-227 instead.

FIPS 204 (Dilithium/ML-DSA) will also see some big changes. The SampleInBall function will change, which breaks backward compatibility and requires new test vectors. Meanwhile, ExpandMask will no longer use an offset for the SHAKE output, missing checks have been fixed, and domain-separated pure and pre-hash variants were introduced.

Finally, FIPS 205 (Sphincs+/SLH-DSA) will not see any significant changes. The biggest change is that the internal functions will take randomness as input to enable CAVP testing.
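To make the "randomness as input" idea concrete, here is a minimal sketch of the pattern, not any official NIST API: the function names and hash choices are illustrative. A caller-facing keygen samples its own randomness, while a derandomized internal variant takes the seed as an explicit argument so a known-answer test can exercise it deterministically.

```python
import hashlib
import os

def keygen_internal(seed: bytes) -> tuple[bytes, bytes]:
    """Hypothetical 'internal' keygen: all randomness is an explicit input,
    so a known-answer test can supply a fixed seed and check the output."""
    # Derive deterministic public/secret key material from the seed.
    pk = hashlib.shake_256(b"pk" + seed).digest(32)
    sk = hashlib.shake_256(b"sk" + seed).digest(64)
    return pk, sk

def keygen() -> tuple[bytes, bytes]:
    """Caller-facing keygen: samples fresh randomness, then defers to the
    derandomized internal function."""
    return keygen_internal(os.urandom(32))

# A CAVP-style known-answer test only exercises the internal function:
fixed_seed = bytes(32)
pk1, sk1 = keygen_internal(fixed_seed)
pk2, sk2 = keygen_internal(fixed_seed)
assert (pk1, sk1) == (pk2, sk2)  # deterministic, hence testable
```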

Side-channel attacks are top of mind

Side channels came up several times, with many conversations about how to defend against such attacks.

One speaker talked about how they made Sphincs+ side-channel resistant, but doing so reduced the performance of the reference implementation by 70%. Meanwhile, another speaker shared their experience combining several weak physical leakages of HQC to recover the shared key using SASCA (a soft analytical side-channel attack); however, this only applied to the reference implementation of HQC, not the optimized version.

Notably, one session on the topic used neural networks to attack Dilithium's key-unpacking function and recover the secret key. The Dilithium team responded that they consider the unpacking function to be protected and suggested statically masking the stored key, using constant-weight encoding, or shuffling the key-unpacking loop to avoid such vulnerabilities.
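For illustration, here is a minimal sketch of the loop-shuffling countermeasure mentioned above. The data layout and "decoding" step are placeholders, not Dilithium's actual unpacking logic; the point is simply that processing coefficients in a fresh random order makes side-channel traces harder to align.

```python
import secrets

def unpack_key_shuffled(packed: list[int]) -> list[int]:
    """Illustrative countermeasure: process key coefficients in a random
    order so a side-channel trace cannot be aligned to a fixed index order.
    The 'decoding' here is a stand-in for the real coefficient unpacking."""
    n = len(packed)
    order = list(range(n))
    # Fisher-Yates shuffle driven by fresh secret randomness.
    for i in range(n - 1, 0, -1):
        j = secrets.randbelow(i + 1)
        order[i], order[j] = order[j], order[i]

    out = [0] * n
    for idx in order:
        out[idx] = packed[idx] & 0xFF  # placeholder for real decoding
    return out

coeffs = unpack_key_shuffled(list(range(256)))
```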

Planning for the transition to PQC

While standards have yet to be finalized, organizations everywhere need to keep the transition to PQC top of mind. As a result, this transition prep was an important topic during the conference.

Key activities for the transition include identifying all PKI objects and their lifetimes in the system, preparing for data format and technology constraints, and implementing and utilizing available open-source tools.

Some common post-quantum engineering obstacles organizations are already encountering include algorithm identifiers, object encoding, interoperability awareness, and cryptographic tokens. In particular, the ASN.1 OIDs for PQC algorithms are still messy and need to be unified. Fortunately, some efforts are underway to help, like the IETF Hackathon GitHub page, which maintains a table of current OIDs.
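As a rough illustration of the inventory step, the sketch below (assuming the Python `cryptography` package and a placeholder `./certs` directory) walks a folder of PEM certificates and records each one's signature-algorithm OID and expiry date, the kind of data a migration plan needs before any algorithm swap.

```python
from pathlib import Path
from cryptography import x509

def inventory_certs(cert_dir: str) -> list[dict]:
    """Walk a directory of PEM certificates and record the facts a PQC
    migration plan needs: signature algorithm OID and expiry date."""
    findings = []
    for pem in Path(cert_dir).glob("*.pem"):
        cert = x509.load_pem_x509_certificate(pem.read_bytes())
        findings.append({
            "file": pem.name,
            "subject": cert.subject.rfc4514_string(),
            "sig_alg_oid": cert.signature_algorithm_oid.dotted_string,
            "not_after": cert.not_valid_after.isoformat(),
        })
    return findings

for entry in inventory_certs("./certs"):  # "./certs" is a placeholder path
    print(entry)
```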

Understanding the impact of PQC on real-world connections

One question that came up was about the effect PQC will have on real-world connections like TLS 1.3.

The short answer is that PQC algorithms slow down the TLS handshake. That said, the more data a webpage transfers, the smaller the relative impact of that handshake slowdown on the overall connection. As a result, looking only at handshake speed is not a meaningful way to measure efficiency; the amount of data sent matters too.
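A quick back-of-the-envelope calculation makes the point. The handshake overhead and link speed below are purely illustrative assumptions, not measurements from the conference.

```python
# Illustrative numbers only: assume a PQC key exchange adds a fixed
# overhead to the TLS handshake, and compare it with the transfer time
# for pages of different sizes on a 100 Mbit/s link.
handshake_overhead_ms = 15.0   # assumed extra cost of a PQC handshake
link_mbps = 100.0

for page_kb in (10, 100, 1_000, 10_000):
    transfer_ms = page_kb * 8 / (link_mbps * 1000) * 1000
    total_ms = transfer_ms + handshake_overhead_ms
    relative = handshake_overhead_ms / total_ms * 100
    print(f"{page_kb:>6} KB page: handshake overhead is {relative:5.1f}% of total time")
```

The larger the page, the smaller the share of total time the handshake slowdown represents, which is why handshake benchmarks alone can be misleading.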

Digital signatures remain an important point of interest

There was a lot of conversation about the methods used in the new signature round, particularly how GGM trees were optimized to be useful in signature algorithms. 
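As background, a GGM tree expands a single root seed into many pseudorandom values by repeatedly applying a length-doubling PRG. The sketch below is a generic illustration of that structure; using SHAKE-256 as the PRG is an assumption for the example, not a detail from any particular submission.

```python
import hashlib

def ggm_expand(seed: bytes, depth: int) -> list[bytes]:
    """Minimal GGM tree: each node seed is expanded into a left and right
    child with a length-doubling PRG (modelled here with SHAKE-256).
    Returns the leaves: 2**depth pseudorandom values from one root seed."""
    level = [seed]
    for _ in range(depth):
        next_level = []
        for node in level:
            expanded = hashlib.shake_256(node).digest(64)
            next_level.append(expanded[:32])   # left child
            next_level.append(expanded[32:])   # right child
        level = next_level
    return level

leaves = ggm_expand(b"\x00" * 32, depth=4)
assert len(leaves) == 16
```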

Specifically, ANTRAG simplifies and improves Falcon without compromising security by using a hybrid sampler rather than the KGPV sampler. As a result, ANTRAG makes for a much simpler signature scheme with improved performance and no security loss, which is why the Falcon team is interested in adopting this approach in their algorithm. However, it does not eliminate the floating-point problem NIST is currently contending with (though it does reduce it).

There's also research exploring Sphincs+ parameter sets for specific use cases. For use cases like firmware signing, such a high maximum signing capacity is unnecessary, and lowering this limit reduces the signature size by 50% and improves verification speed by 80%. The risk of this approach is that tracking the signature count makes the scheme effectively stateful, in which case there are better options than Sphincs+. Additionally, low usage limits have been problematic in the past, as in the case of AES-GCM.

Finally, there were several talks about hardware for the new signature algorithms and the theory behind them. One talk presented a new design for BIKE, called Lean BIKE, which trades CCA security for better performance and a simpler implementation. There were also discussions about whether multi-recipient KEMs could provide state synchronization in HSM fleets and make MLS TreeKEM more efficient.

Can we speed up hashing?

One talk discussed accelerating SLH-DSA by two orders of magnitude, but only in hardware. Unfortunately, software speed-ups are difficult: much of the bottleneck comes from computing long chains of small hashes, and hash accelerators are designed to hash bulk data rather than the outputs of other hashes, so this process is hard to speed up.
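The sketch below illustrates that bottleneck in plain Python: hashing the same volume of input once as a bulk message versus as a long chain of small, dependent hashes (roughly the access pattern of a hash-based signature scheme). The exact timings depend on the machine; the point is the structural difference between the two workloads.

```python
import hashlib
import time

data = b"\x00" * (1 << 20)  # 1 MiB of input

# One bulk hash over the whole buffer.
start = time.perf_counter()
hashlib.sha256(data).digest()
bulk = time.perf_counter() - start

# The same amount of input, but as 32,768 chained 32-byte hashes,
# where each call depends on the previous output.
start = time.perf_counter()
h = b"\x00" * 32
for _ in range(len(data) // 32):
    h = hashlib.sha256(h).digest()
chained = time.perf_counter() - start

print(f"bulk hash:    {bulk * 1000:.2f} ms")
print(f"chained hash: {chained * 1000:.2f} ms")
```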

What does this all mean? PQC is fast approaching and the time to prepare is now

NIST’s Fifth PQC Standardization Conference was jam-packed with forward-looking conversations about what to expect in a post-quantum world, changes happening today, and challenges to consider. Perhaps the biggest takeaway is that PQC will be here before we know it, and there’s still a lot of preparation that needs to be done.

To that end, NIST will continue working toward standardizing the final algorithms and making additional changes to maintain security in a post-quantum world. At the same time, every organization must start preparing for the algorithm shift and any other changes NIST may recommend as we approach PQC.

Need help getting started? Keyfactor’s team of PKI experts can help future-proof your PKI strategy. Contact us here.