HQC: A New Post-Quantum Cryptography Standard Based on Codes
- José Darío Flórez Gómez
- Jul 30
- 9 min read
This article presents a technical and educational review of the HQC (Hamming Quasi-Cyclic) algorithm, recently standardized by NIST as a post-quantum encryption scheme. It describes the mathematical foundations of code-based cryptography, the detailed construction of HQC, the underlying computational problem of syndrome decoding in quasi-cyclic codes, and the impact of its standardization with a comparison to Kyber.
Post-quantum cryptography (PQC) aims to develop cryptographic schemes that are resistant to attacks by future quantum computers. Among the various families of PQC algorithms, one of the oldest is code-based cryptography, introduced in 1978 with the McEliece cryptosystem [1]. This approach uses error-correcting codes to construct public-key encryption systems.

In simple terms, it leverages the ability of certain codes to detect and correct errors in messages, using this as the foundation for a cryptographic scheme: the message is transformed into a valid codeword, and then random errors (bit flips) are introduced before transmission.
Only someone with the secret key (i.e., the special knowledge needed to decode the code and correct the errors) can recover the original message.
An adversary, on the other hand, faces the problem of correcting errors without knowing the structure of the code, something that is computationally infeasible for sufficiently large codes [2].
Illustrative example. Suppose we have a simple error-correcting code that encodes 4-bit words into 7-bit words by adding parity bits (similar to a Hamming code). If we want to encrypt the message 1011, we first encode it, obtaining, say, 1011010 as the codeword. We then randomly flip some bits (introducing errors) before transmission, for example, flipping the third bit to get the ciphertext 1001010.
At first glance, this ciphertext appears to be a meaningless sequence of bits; however, someone who knows the code used can detect and correct the error (by adjusting the flipped bit), recover 1011010, and from there, the original message 1011.
For an attacker without that knowledge, decoding the ciphertext is as hard as trying bit combinations until finding a nearby valid codeword, a problem that grows exponentially with the code length. In fact, decoding a general linear code is NP-complete [8], which underpins the security of these systems even against quantum attackers, since no efficient quantum decoding algorithm is known.
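The toy example above can be made concrete in a few lines of Python. This is a minimal, illustrative Hamming(7,4) implementation (with parity bits at positions 1, 2, and 4); its bit layout differs from the article's made-up codeword 1011010, but the encode / flip / correct workflow is the same:

```python
def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit Hamming(7,4) codeword.
    Layout (1-indexed): p1 p2 d1 p3 d2 d3 d4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(r):
    """Correct a single bit flip via the syndrome; returns the fixed codeword."""
    c = list(r)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3          # syndrome 0 means "no error"
    if pos:
        c[pos - 1] ^= 1                 # flip the offending bit back
    return c

msg = [1, 0, 1, 1]
cw = hamming74_encode(msg)              # -> [0, 1, 1, 0, 0, 1, 1]
noisy = list(cw)
noisy[2] ^= 1                           # "ciphertext": flip the third bit
fixed = hamming74_correct(noisy)
assert fixed == cw                      # the decoder recovers the codeword
assert [fixed[2], fixed[4], fixed[5], fixed[6]] == msg
```

Without the code's structure, an attacker must instead search the error patterns exhaustively, which is exactly the combinatorial problem that scales exponentially with the code length.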
Code-Based Cryptography: Concepts and Background
The idea of using error-correcting codes in cryptography was proposed by Robert McEliece in 1978, employing algebraic codes (Goppa) to construct a public-key encryption scheme. The security is based on the difficulty of decoding (finding the error vector) without the secret key. In the classical McEliece scheme, the public key is essentially a generator matrix of a linear algebraic code hidden by random transformations, and the private key allows correction of up to a certain number of errors.

One traditional drawback of this family has been the large key sizes: for example, in McEliece, the public and private keys can be on the order of hundreds of kilobytes or even megabytes [1], which hindered its practical adoption.
Nevertheless, these systems have withstood decades of cryptanalysis; in fact, the classic McEliece remains unbroken after more than 40 years, making code-based cryptography a conservative and secure bet in the post-quantum world.
With advances in coding theory research, variants emerged aiming to reduce key size by exploiting special structures in the codes. In particular, quasi-cyclic codes and MDPC (Moderate Density Parity-Check) codes were developed, which introduce repetitive patterns (cyclic structure) into the code’s matrix to drastically reduce the length of the public key without severely weakening security.
For example, the BIKE and HQC schemes, proposed within the NIST PQC standardization competition, use binary linear quasi-cyclic codes, where large parity-check matrices are represented with just a few circulant generator vectors. This structure allows the public key to be much more compact than in classic McEliece (which used randomized Goppa codes) [5, 3].
The HQC Post-Quantum Algorithm (Hamming Quasi-Cyclic)
HQC is a code-based key encapsulation mechanism (KEM) that was selected by NIST as a new post-quantum standard in March 2025 [4]. It leverages the structural advantages of binary linear quasi-cyclic codes, and its name “Hamming Quasi-Cyclic” refers to its operation over the Hamming metric (binary errors) and its use of quasi-cyclic codes.
Unlike McEliece, HQC does not hide the family of codes used: the generator matrix of the employed code is public [2]. Its security does not rely on keeping the code secret, but rather on the hardness of certain decoding problems associated with structured random codes.
Key Components
Code Type and Structure: HQC uses a hybrid of two concatenated classical codes: a decodable code C of length n and dimension k (a concatenation of Reed–Muller and Reed–Solomon codes) capable of correcting up to Δ errors, embedded within a length-2n code with a double circulant structure whose parity-check matrix is (1 | h). The public key includes the description of C (generator matrix G) and the vector h, while the private key contains two vectors (x, y) of length n and low Hamming weight. From them, the public vector is computed as
s = x + h·y in F2[X]/(X^n − 1).
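Concretely, this sum and product live in the ring F2[X]/(X^n − 1), where addition is bitwise XOR and multiplication is a cyclic convolution of bit vectors. A minimal sketch with a toy n = 8 (real HQC parameters use n in the tens of thousands; the vectors below are made-up illustrative values):

```python
def pmul(a, b, n):
    """Product in F2[X]/(X^n - 1): cyclic convolution with XOR accumulation."""
    out = [0] * n
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[(i + j) % n] ^= bj
    return out

def padd(a, b):
    """Addition in F2 is bitwise XOR."""
    return [ai ^ bi for ai, bi in zip(a, b)]

n = 8                                    # toy length only
x = [1, 0, 0, 1, 0, 0, 0, 0]             # secret, low Hamming weight
y = [0, 1, 0, 0, 0, 1, 0, 0]             # secret, low Hamming weight
h = [1, 1, 0, 1, 0, 1, 1, 0]             # public random vector
s = padd(x, pmul(h, y, n))               # public: s = x + h*y
print(s)                                 # [0, 1, 0, 0, 1, 1, 0, 1]
```

Because the ring is commutative, h·y = y·h, a fact the decryption step below relies on when the cross terms cancel.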
Underlying Hard Problem: The security of HQC relies on QCSD (Quasi-Cyclic Syndrome Decoding): given a syndrome s = H·e^T produced by a quasi-cyclic code with parity-check matrix H = (1 | h) and an error vector e of low weight, it is computationally infeasible to recover e without secret information.
This problem is a structured variant of the classic SD (Syndrome Decoding) problem, known to be NP-complete [8]. HQC does not introduce any hidden trapdoor in the code; its security reduces to the hardness of SD in quasi-cyclic codes, thus offering a transparent structure suitable for formal analysis [2].
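To make the attacker's problem concrete, the naive approach to QCSD can be written directly: guess the support of e2, which forces e1 = s + h·e2, and accept when the forced e1 also has low weight. The search space grows combinatorially with n and the weight w, which is why the toy parameters below are tiny. This is an illustrative sketch, not an algorithm from the HQC specification:

```python
from itertools import combinations

def pmul(a, b, n):
    """Product in F2[X]/(X^n - 1): cyclic convolution with XOR accumulation."""
    out = [0] * n
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[(i + j) % n] ^= bj
    return out

def padd(a, b):
    return [ai ^ bi for ai, bi in zip(a, b)]

def brute_force_qcsd(h, s, n, w):
    """Find e = (e1, e2) with e1 + h*e2 = s and wt(e1) + wt(e2) = w.
    Cost grows like sum of C(n, w2) terms: exponential for real parameters."""
    for w2 in range(w + 1):
        for support in combinations(range(n), w2):
            e2 = [0] * n
            for i in support:
                e2[i] = 1
            e1 = padd(s, pmul(h, e2, n))   # e1 is forced by the syndrome equation
            if sum(e1) == w - w2:
                return e1, e2
    return None

# plant a toy instance and recover a valid low-weight solution
n, w = 16, 4
h = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1, 1, 0]
e1 = [1, 0, 0, 0, 0, 1] + [0] * 10             # weight 2
e2 = [0, 0, 0, 1, 0, 0, 0, 0, 0, 1] + [0] * 6  # weight 2
s = padd(e1, pmul(h, e2, n))
f1, f2 = brute_force_qcsd(h, s, n, w)
assert padd(f1, pmul(h, f2, n)) == s           # the found pair matches the syndrome
```

At HQC's real sizes (n in the tens of thousands, weights in the dozens) this search, and every known refinement of it, remains far out of reach classically and quantumly.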
Of course, any encryption scheme based on errors must deal with the possibility of decoding failures (when noise exceeds the correction capacity). HQC was carefully designed to minimize this decryption failure rate (DFR).
In fact, NIST highlighted that HQC achieved an extremely low estimated DFR (on the order of 2^−128) while meeting IND-CCA2 security levels, outperforming candidates such as BIKE, whose DFR analysis remained uncertain [4, 3]. This means that the probability of a valid ciphertext failing to decrypt with the correct key due to excessive errors is practically zero.
Encryption and Decryption (High-Level View): HQC is typically implemented as a KEM, but its encryption core can be described in simplified form:
1. Public key: (h, s) and the generator matrix G of the code C.
2. Private key: (x, y), two secret low-weight vectors.
3. Encoding and randomization: to encrypt a message m of k bits, compute mG ∈ F2^n and randomly generate two vectors r1, r2 of weight wr and an error vector e of weight we [2].
4. Computation of ciphertext components: u = r1 + h·r2 and v = Trunc(mG + s·r2 + e). The resulting ciphertext is the pair c = (u, v) [2].
5. Decryption: with the key (x, y), the receiver computes
v′ = v − u·y = mG + x·r2 − r1·y + e,
and applies the decoder of C to remove the low-weight term x·r2 − r1·y + e and recover m [2].
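The algebra behind those steps can be checked numerically. This toy sketch omits the truncation and stands in for the Reed–Muller/Reed–Solomon decoder with a direct check that v − u·y equals mG plus the low-weight residual x·r2 + r1·y + e (over F2, subtraction is addition), which is exactly what C's decoder must strip away. All parameters are toy values, far smaller than real HQC:

```python
import random

def pmul(a, b, n):
    """Product in F2[X]/(X^n - 1): cyclic convolution with XOR accumulation."""
    out = [0] * n
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[(i + j) % n] ^= bj
    return out

def padd(a, b):
    return [ai ^ bi for ai, bi in zip(a, b)]

def sparse(n, w, rng):
    """Random length-n binary vector of Hamming weight w."""
    v = [0] * n
    for i in rng.sample(range(n), w):
        v[i] = 1
    return v

rng = random.Random(0)
n, w = 64, 3                                  # toy parameters

# key generation
x, y = sparse(n, w, rng), sparse(n, w, rng)   # private key
h = [rng.randrange(2) for _ in range(n)]      # public random polynomial
s = padd(x, pmul(h, y, n))                    # public syndrome s = x + h*y

# encryption of an (already encoded) codeword mG
mG = [rng.randrange(2) for _ in range(n)]     # stand-in for C's encoding of m
r1, r2, e = sparse(n, w, rng), sparse(n, w, rng), sparse(n, w, rng)
u = padd(r1, pmul(h, r2, n))
v = padd(padd(mG, pmul(s, r2, n)), e)         # truncation omitted in this toy

# decryption: v - u*y leaves mG plus a low-weight error term
vp = padd(v, pmul(u, y, n))
residual = padd(vp, mG)                       # equals x*r2 + r1*y + e
assert residual == padd(padd(pmul(x, r2, n), pmul(r1, y, n)), e)
print("residual weight:", sum(residual))      # small enough for C to correct
```

The cancellation works because s·r2 contributes h·y·r2 while u·y contributes h·r2·y, and these are equal in the commutative ring, so only the sparse terms survive.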
Impact of Standardization and Comparison with Kyber (ML-KEM) and Other Algorithms
The standardization of HQC by NIST in 2025 marks an important milestone in post-quantum cryptography. HQC thus becomes the fifth PQC algorithm selected by NIST for its portfolio of standards, and the second KEM alongside Kyber (renamed in the standard as ML-KEM) [7].
Broadly speaking, Kyber (ML-KEM) is a scheme based on modular Euclidean lattices, chosen for its excellent performance and security based on well-studied lattice problems [6]. HQC, in contrast, represents a different mathematical lineage (codes) and was selected as a “backup algorithm” for general encryption [4].
Dustin Moody, lead cryptographer of the NIST PQC project, explained that the goal is to have two independent approaches in case a vulnerability is discovered in one of them in the future [5]. In this way, HQC adds redundancy and diversity to the cryptographic arsenal: if any quantum or mathematical breakthrough weakens the security of structured lattices (on which Kyber is based), error-correcting codes would serve as a resilient alternative.
Comparison with Kyber (ML-KEM)
NIST makes it clear that HQC is not intended to replace Kyber, which remains the primary KEM due to its efficiency and broad compatibility [3]. Kyber offers very fast encryption/decryption and small key and ciphertext sizes; for example, a public key in Kyber-768 (classical security ∼192 bits) is approximately 1184 bytes, and the ciphertext is about 1088 bytes [6].
In HQC at an equivalent security level (∼ 192 bits), the public key is around 4522 bytes and the ciphertext approximately 9042 bytes [2], which are significantly larger (at level 1, HQC has pk ≈ 2.2 KB vs. ∼ 800 B for Kyber512).
In other words, HQC pays a size penalty: its public keys and ciphertexts can be 3–4 times larger than those of Kyber for similar security levels.
Likewise, HQC’s encryption operations are slower, involving manipulation of polynomials thousands of bits long and code decoding, although still within manageable ranges. According to evaluations, HQC’s encapsulation speed is approximately half that of BIKE [3]; compared to Kyber, HQC is also slower, but still adequate for many practical uses (in optimized implementations, operations may take on the order of microseconds to a few milliseconds) [7].
Security and Theoretical Robustness
Where HQC shines is in its security and theoretical robustness; its analysis is considered very mature and solid [7, 3]. NIST evaluated that HQC showed advantages over other code-based candidates: for example, BIKE shared similar foundations, but its DFR analysis was less complete, generating uncertainty [3], and Classic McEliece, although extremely secure, carries public keys on the order of hundreds of kilobytes to megabytes [1], which limits its practical adoption in many environments.
In fact, NIST indicated that McEliece might be standardized separately by ISO to avoid duplicated efforts [5]. HQC, with its intermediate approach, achieves a balanced compromise between size and performance: thanks to the quasi-cyclic structure, its parameters are relatively manageable (keys and ciphertexts much smaller than McEliece, only modestly larger than Kyber), and its algorithms sufficiently efficient, especially with hardware optimizations due to its linear and bit-parallel nature [7].
In terms of relative advantages and disadvantages compared to Kyber and other PQC standards:
Cryptographic Diversity: HQC’s greatest advantage is providing diversification in mathematical assumptions. Kyber (ML-KEM) is based on algebraic lattices; if a specific quantum attack against lattices were discovered tomorrow, all implementations based on them would be at risk. HQC, being based on codes, would remain secure against such a hypothetical attack vector, offering a second line of defense and improving the crypto-agility of the ecosystem [5].
Proven Hardness: Code decoding problems have been studied for decades. There are no known efficient quantum algorithms that solve them significantly faster than classical ones (unlike factoring or discrete logarithms, which are broken by Shor’s algorithm). HQC leverages this historical hardness and, by not hiding any trapdoors in its design, facilitates formal analysis and confidence in its resilience [2, 7].
Key Size and Performance: HQC’s main disadvantage compared to Kyber is its bandwidth efficiency and computation time. Kyber requires less bandwidth (smaller keys and ciphertexts) and is very fast thanks to polynomial arithmetic using the number-theoretic transform (NTT) [6]. HQC, although optimized, handles vectors of several kilobytes (e.g., 10–12 KB per transaction vs. 2 KB for Kyber), and its decoding operations can be more expensive [2]. Nonetheless, these sizes and times remain manageable in most applications.
Reliability and Error Handling: Kyber’s decryption failure probability is negligible by design. HQC, involving probabilistic decoding, introduces a DFR (decryption failure rate), but this has been reduced to negligible levels (< 10^−38 under standard parameters), making HQC as reliable as Kyber for normal use cases [4].
Conclusion
The standardization of HQC by NIST marks a significant step in the search for robust cryptography for the quantum era. HQC, as a modern representative of code-based cryptography, demonstrates that these long-standing principles (error correction and hard decoding) remain relevant and effective against the most advanced threats.
With a clever construction that combines classical codes with error-masking techniques, HQC achieves a balance between security and efficiency, narrowing the gaps that have historically separated code-based schemes from their lattice-based counterparts.
In the post-quantum landscape, HQC brings mathematical diversity and confidence: its security hypothesis is based on problems that have withstood the scrutiny of researchers for decades [3]. By joining Kyber as an alternative standard, it reinforces the message that no single mathematical construction is infallible, but with multiple approaches we can secure our cryptographic systems against the unknown [4].
Organizations and developers will be able to implement HQC to increase the resilience of their communications, for example, in critical infrastructure or long-term systems where an extra layer of protection is desired against future cryptanalytic advances.
HQC also revitalizes interest in code-based cryptography, a field with a long lineage but often overshadowed by other techniques. Its success in the NIST competition, prevailing over candidates such as BIKE and Classic McEliece, teaches us that with ingenuity it is possible to mitigate classical limitations (such as key sizes) and achieve practical standards based on codes.
Challenges remain from optimizing implementations to managing associated patents (which, as reported, will be licensed royalty-free to facilitate adoption) [5].
Ultimately, HQC emerges as an essential component in the post-quantum cryptographic toolbox. With its adoption, we enter a new stage where encryption schemes from two distinct schools (lattices and codes) will coexist to protect information. This duality increases confidence that our data will remain secure even in the face of the eventual rise of quantum computing.
As is often the case in cryptography, prudence suggests not “putting all your mathematical eggs in one basket.” HQC offers precisely that alternative basket—rooted in error correction and the complexity of decoding—ready to keep our secrets safe in the post-quantum future [3].
At Cyte, we invite the community to explore and integrate these new tools of post-quantum cryptography as part of their security strategies, thus ensuring a reliable transition toward a world resilient to the power of quantum computing.
Sources
[1] Wikipedia, “McEliece cryptosystem,” https://en.wikipedia.org/wiki/McEliece_cryptosystem.
[2] “HQC Specification (2025),” PQC-HQC.org, https://pqc-hqc.org.
[3] Watchdata editorial team, “NIST PQC fourth round: HQC selected as the new KEM candidate!,” https://www.watchdata.com/es/1007/.
[4] NIST, “NIST Selects HQC as Fifth Algorithm for Post-Quantum Encryption (news release),” 2025, https://www.nist.gov/news-events/news/2025/03/nist-selects-hqc-fifth-algorithm-post-quantum-encryption.
[5] NIST, “NIST PQC Standardization Process: Fourth Round Status Report (IR 8545),” 2025, https://csrc.nist.gov/pubs/ir/8545/final.
[6] Wikipedia, “Kyber (cryptography),” https://en.wikipedia.org/wiki/Kyber.
[7] NIST, “FIPS 203 – Module-Lattice-Based Key-Encapsulation Mechanism Standard (ML-KEM),” 2024, https://csrc.nist.gov/pubs/fips/203/final.
[8] N. Sendrier, “Decoding One Out of Many,” in PQCrypto 2011, 2011, https://www.researchgate.net/publication/220961279_Decoding_One_Out_of_Many.