Methodology
How the registry evaluates post-quantum cryptographic algorithms. Each algorithm is assessed across standardization maturity, security properties, risk profile, performance, and implementation readiness.
Scope
The registry tracks post-quantum cryptographic algorithms relevant to practical deployment. Algorithms are included based on standardization progress, adoption potential, or significance to specific ecosystems (e.g. blockchain).
The registry currently tracks 6 digital signature schemes and 2 key encapsulation mechanisms.
Assumption Families
Every algorithm is classified by the mathematical hardness assumption it relies on. This determines the algorithm's fundamental security posture and informs the assumption risk rating.
- Lattice-based
- Security relies on the hardness of problems in high-dimensional lattices, such as finding short vectors (SVP) or distinguishing noisy linear equations (LWE). Offers strong performance and compact keys, with decades of cryptanalytic study.
- Hash-based
- Security relies solely on the properties of cryptographic hash functions (collision resistance, preimage resistance). Considered the most conservative assumption family because no structural algebraic assumption is required beyond hash function security.
- Code-based
- Security relies on the hardness of decoding random linear error-correcting codes. The McEliece cryptosystem (1978) is the oldest post-quantum proposal in this family, providing strong confidence through decades of study.
- Isogeny-based
- Security relies on the hardness of computing isogenies between elliptic curves. A newer family offering compact key sizes but with a less mature cryptanalytic history. The 2022 attack that broke SIDH demonstrated the importance of ongoing analysis.
Standardization Status
The registry records standardization progress from recognized bodies including NIST, IETF, and relevant ecosystem organizations. Each algorithm's status reflects the maturity of its formal specification.
- Standard
- Published as a final standard by a recognized body (e.g. NIST FIPS, IETF RFC). Suitable for production deployment.
- Draft
- Standard is in development with a designated reference number. The specification is largely stable but may still undergo minor revisions before finalization.
- Candidate
- Selected for standardization but the formal specification has not yet been published. Subject to change during the standardization process.
- Research
- Active academic proposal or early-stage design. Not yet adopted by a standardization body. Included in the registry for tracking and comparison.
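The four statuses form an ordered maturity scale. As a sketch (the names and the `production_ready` helper are illustrative, not part of the registry's schema), they could be modeled as an ordered enum so tooling can gate deployment decisions:

```python
from enum import IntEnum

class StandardizationStatus(IntEnum):
    """Ordered maturity scale; higher values mean a more mature specification.
    Illustrative model -- not the registry's actual schema."""
    RESEARCH = 0   # active academic proposal, no standards-body adoption
    CANDIDATE = 1  # selected for standardization, spec not yet published
    DRAFT = 2      # designated reference number, spec largely stable
    STANDARD = 3   # published final standard (e.g. NIST FIPS, IETF RFC)

def production_ready(status: StandardizationStatus) -> bool:
    """Only finalized standards are suitable for production deployment."""
    return status is StandardizationStatus.STANDARD
```

With this ordering, `production_ready(StandardizationStatus.DRAFT)` is false, reflecting that a draft may still undergo revisions before finalization.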
Security Properties
Each algorithm is characterized by four security properties that describe its formal guarantees and operational requirements.
- Hardness Assumption
- The specific mathematical problem that must remain hard for the scheme to be secure (e.g. Module-LWE, QCSD). Distinct from the broader assumption family.
- Security Notion
- The formal security guarantee the scheme provides. Digital signatures target EUF-CMA (existential unforgeability under chosen message attack). KEMs target IND-CCA2 (indistinguishability under adaptive chosen ciphertext attack).
- Deterministic
- Whether the core operation (signing or encapsulation) produces the same output given the same inputs. Non-deterministic schemes use internal randomness, which can provide hedging against fault attacks but requires a reliable entropy source.
- Statefulness
- Whether the signer must maintain state between operations. Stateful schemes (e.g. XMSS) require careful index tracking — reusing state can be catastrophic. Stateless schemes are simpler to deploy but may have larger signatures.
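To illustrate why statefulness matters: a stateful scheme such as XMSS assigns each signature a one-time index, and the signer must advance (and persist) the index before releasing a signature, because index reuse is catastrophic. A minimal sketch of that discipline (the signer class and placeholder signature are hypothetical, not a real XMSS API):

```python
class StatefulSigner:
    """Sketch of safe index tracking for a stateful signature scheme.
    Reusing an index compromises security, so the counter is advanced
    (and in practice persisted to stable storage) *before* a signature
    is released."""

    def __init__(self, max_signatures: int):
        self.next_index = 0
        self.max_signatures = max_signatures

    def sign(self, message: bytes) -> tuple[int, bytes]:
        if self.next_index >= self.max_signatures:
            raise RuntimeError("key exhausted: all one-time indices used")
        index = self.next_index
        self.next_index += 1               # persist before releasing the signature
        signature = b"<sig placeholder>"   # a real scheme signs with leaf `index`
        return index, signature

signer = StatefulSigner(max_signatures=4)
idx0, _ = signer.sign(b"hello")
idx1, _ = signer.sign(b"world")
```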
Risk Assessment
Risk assessments use a conservative three-tier scale across three independent dimensions. Definitions are intentionally narrow to avoid ambiguity. Ratings are assigned based on published cryptanalysis, implementation track record, and inherent algorithmic properties.
Assumption Risk
Confidence in the underlying mathematical hardness assumption.
- Low
- Well-studied assumption with decades of scrutiny and no known sub-exponential attacks (classical or quantum). Example: Module-LWE at standard parameters.
- Medium
- Assumption is broadly believed to hold, but has less cryptanalytic history or relies on structured variants that narrow the safety margin.
- High
- Assumption is novel, has limited independent analysis, or recent results have weakened confidence in related problems.
Implementation Risk
Difficulty of producing a correct, standards-conformant implementation.
- Low
- Straightforward to implement correctly: reference code is mature, edge cases are few, and the API design is misuse-resistant.
- Medium
- Requires careful attention to parameter encoding, rejection sampling, or key validation. Subtle bugs have appeared in early implementations.
- High
- Complex state management, fragile failure modes, or a history of implementation-level vulnerabilities across multiple libraries.
Side-Channel Risk
Susceptibility to timing, power, or electromagnetic side-channel attacks.
- Low
- Core operations are naturally constant-time or trivially hardened. No known exploitable leakage in standard deployment models.
- Medium
- Constant-time implementation is achievable but requires explicit effort (e.g., masked arithmetic, constant-time rejection sampling). Non-hardened code is vulnerable.
- High
- Inherent algorithmic features (e.g., variable-weight operations, secret-dependent branching) make side-channel resistance difficult even for expert implementers.
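Because the scale is conservative, one natural mechanical reading (a sketch only; the registry publishes the three dimensions independently rather than a combined score) is that an assessment is only as strong as its worst dimension:

```python
TIERS = ["low", "medium", "high"]  # ordered least to most risky

def worst_tier(assumption: str, implementation: str, side_channel: str) -> str:
    """Conservative summary across the three independent risk dimensions.
    Illustrative only -- the registry reports each dimension separately."""
    ratings = (assumption, implementation, side_channel)
    for rating in ratings:
        if rating not in TIERS:
            raise ValueError(f"unknown tier: {rating!r}")
    return max(ratings, key=TIERS.index)

# A well-studied assumption with a tricky constant-time implementation:
summary = worst_tier("low", "medium", "medium")
```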
Performance Evaluation
Performance is captured at two levels of detail. All algorithms receive qualitative relative ratings. When available, quantitative benchmarks are included with platform and source attribution.
Each core operation is rated on a three-point scale relative to other algorithms of the same primitive type.
Digital signatures are rated on keygen, sign, and verify. KEMs are rated on keygen, encaps, and decaps.
When available, quantitative benchmarks report wall-clock timing in microseconds for each operation and parameter set. Each benchmark entry includes the hardware platform and source implementation to ensure reproducibility. Benchmarks are not available for all algorithms.
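Since benchmark entries report wall-clock microseconds per operation, throughput comparisons follow directly. A small helper (the timing figures in the example are made up for illustration, not real registry data):

```python
def ops_per_second(microseconds: float) -> float:
    """Convert a per-operation wall-clock timing in microseconds to ops/sec."""
    if microseconds <= 0:
        raise ValueError("timing must be positive")
    return 1_000_000 / microseconds

# Hypothetical benchmark entry: operation -> microseconds per call.
bench = {"keygen": 50.0, "sign": 200.0, "verify": 25.0}
throughput = {op: ops_per_second(us) for op, us in bench.items()}
```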
Parameter Sets & Sizes
Each algorithm defines one or more parameter sets at different NIST security levels. The registry records the concrete byte sizes for all cryptographic artifacts.
NIST security levels (1–5) express the minimum computational effort required to break a scheme, benchmarked against attacks on symmetric primitives: levels 1, 3, and 5 correspond to exhaustive key search on AES-128, AES-192, and AES-256 respectively, while levels 2 and 4 correspond to collision search on SHA-256 and SHA-384.
Digital Signatures
- Public key (pk_bytes)
- Secret key (sk_bytes)
- Signature (sig_bytes)
KEMs
- Public key (pk_bytes)
- Secret key (sk_bytes)
- Ciphertext (ct_bytes)
- Shared secret (ss_bytes)
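These byte sizes translate directly into on-the-wire overhead: a certificate-style message carries at least pk_bytes + sig_bytes, and a KEM exchange costs pk_bytes one way plus ct_bytes back. A worked sketch using the published ML-DSA-44 sizes from FIPS 204 and ML-KEM-768 sizes from FIPS 203 (the helper functions themselves are illustrative):

```python
def signature_wire_overhead(pk_bytes: int, sig_bytes: int) -> int:
    """Minimum bytes to transmit a public key plus one signature."""
    return pk_bytes + sig_bytes

def kem_wire_overhead(pk_bytes: int, ct_bytes: int) -> int:
    """Minimum bytes for one KEM exchange: public key out, ciphertext back."""
    return pk_bytes + ct_bytes

# ML-DSA-44 (FIPS 204): pk 1312 B, sig 2420 B.
ml_dsa_44 = signature_wire_overhead(pk_bytes=1312, sig_bytes=2420)
# ML-KEM-768 (FIPS 203): pk 1184 B, ct 1088 B.
ml_kem_768 = kem_wire_overhead(pk_bytes=1184, ct_bytes=1088)
```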
Capabilities
Capabilities describe functional properties beyond core cryptographic operations. These help identify algorithms suited to specific deployment scenarios.
- batch-verification
- Multiple signatures can be verified simultaneously faster than verifying each individually.
- threshold-signatures
- The signing key can be split among multiple parties, requiring a threshold of participants to produce a valid signature.
- aggregation
- Multiple signatures can be combined into a single compact proof of validity.
- hedged-signing
- Signing incorporates both deterministic and randomized components, providing resilience against both fault injection and poor randomness.
- hardware-friendly
- Operations map efficiently to constrained hardware (HSMs, smart cards, embedded devices) without requiring large memory or complex arithmetic.
- hybrid-mode
- Can be deployed alongside a classical algorithm (e.g. ECDSA + ML-DSA) in a hybrid construction for defense-in-depth during migration.
- forward-secrecy
- Compromise of the current key does not allow decryption of past sessions or forgery of past signatures.
- key-agreement
- Can be used to establish a shared secret between two parties for symmetric encryption.
- snark-aggregation
- Signatures are designed for efficient aggregation via STARK/SNARK proofs, enabling scalable on-chain verification.
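Capability tags lend themselves to set-based queries when selecting an algorithm for a deployment scenario. A sketch of such a filter (the algorithm entries are illustrative stand-ins; the capability tags are the ones defined above):

```python
# Hypothetical registry excerpt: algorithm name -> capability tags.
REGISTRY = {
    "alg-a": {"hybrid-mode", "batch-verification", "hardware-friendly"},
    "alg-b": {"aggregation", "snark-aggregation"},
    "alg-c": {"hybrid-mode", "threshold-signatures"},
}

def with_capabilities(registry: dict[str, set[str]], required: set[str]) -> list[str]:
    """Return algorithms whose capability set covers every required tag."""
    return sorted(name for name, caps in registry.items() if required <= caps)

hybrid_capable = with_capabilities(REGISTRY, {"hybrid-mode"})
```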
Implementation Tracking
The registry catalogs known implementations for each algorithm, recording the implementation name, language, source link, and whether the code has undergone a formal security audit.
- Audited
- The implementation has been reviewed by an independent security auditor. This does not guarantee the absence of vulnerabilities, but indicates a higher level of scrutiny.
- Unaudited
- No known formal audit. The implementation may still be high quality, but has not been independently verified. Exercise additional caution for production use.
Contributing
The PQ Crypto Registry is open source. Algorithm data, risk assessments, and methodology are maintained in structured YAML files with automated validation. Contributions are welcome via pull request.
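The automated validation mentioned above can be approximated with a simple schema check over an already-parsed YAML document. A sketch (the size field names follow the keys defined in "Parameter Sets & Sizes"; the rest of the entry shape is an assumption, not the registry's actual schema):

```python
REQUIRED_FIELDS = {"name", "family", "status", "parameter_sets"}
VALID_STATUSES = {"standard", "draft", "candidate", "research"}

def validate_entry(entry: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the entry passes."""
    errors = []
    missing = REQUIRED_FIELDS - entry.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if entry.get("status") not in VALID_STATUSES:
        errors.append(f"invalid status: {entry.get('status')!r}")
    for ps in entry.get("parameter_sets", []):
        for key in ("pk_bytes", "sk_bytes"):
            value = ps.get(key)
            if not isinstance(value, int) or value <= 0:
                errors.append(f"{ps.get('name', '?')}: {key} must be a positive integer")
    return errors

# An entry as it might look after parsing one YAML document:
entry = {"name": "example", "family": "lattice", "status": "draft",
         "parameter_sets": [{"name": "level-1", "pk_bytes": 1312, "sk_bytes": 2560}]}
```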