Imagine a world where a single quantum computer could unlock every encrypted file, from bank transactions to government secrets, in minutes. As quantum computing advances, it threatens to break the cryptographic systems that secure our digital lives. While current quantum computers are not yet at this scale, their rapid development necessitates proactive measures.
To counter such quantum attacks, we need a new generation of encryption to safeguard our personal information. This is where Post-Quantum Cryptography (PQC) comes into the picture. In 2025, PQC is becoming the foundation of future digital security, driven by industry advancements and government directives.
This blog explains why we need PQC, how it addresses the quantum threat, which algorithms have been standardized, where PQC is already used in the real world, and how industry is adopting it, giving you a comprehensive understanding of why PQC is the future and preparing you for the coming cryptographic shift.
Why Do We Need PQC?
Our digital infrastructure depends on cryptographic systems like RSA and Elliptic Curve Cryptography (ECC) to secure everything from online transactions to software updates. These systems rely on mathematical problems, such as factoring large numbers or solving discrete logarithms, that are computationally infeasible for classical computers to crack. Quantum computers, however, operate differently. While classical computers use bits (0 or 1), quantum computers use qubits, which can exist in multiple states simultaneously due to quantum superposition, allowing them to solve certain problems exponentially faster than classical systems.
The primary threat is Shor’s algorithm, developed by Peter Shor in 1994, which can factor large numbers and compute discrete logarithms in polynomial time on a quantum computer. This means the time needed grows only polynomially with the size of the input, making even very large problems tractable. This capability could break a 2048-bit RSA key in minutes, a task that would take classical computers billions of years.
Additionally, Grover’s algorithm accelerates brute-force attacks on symmetric cryptography, effectively halving key strength and necessitating longer keys for security. Both algorithms pose distinct but equally significant threats to the foundational security of our digital communications. Experts, including Gartner, warn that quantum computers will become capable of breaking most asymmetric cryptography by 2029. (Gartner Report)
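The effect of Grover’s algorithm on symmetric keys is easy to quantify: searching an n-bit keyspace takes roughly 2^(n/2) quantum operations instead of 2^n classical guesses. A quick back-of-the-envelope sketch in Python (the function name is ours, purely illustrative):

```python
def grover_effective_bits(key_bits: int) -> int:
    """Grover's algorithm searches a space of 2^n keys in roughly
    sqrt(2^n) = 2^(n/2) quantum operations, halving effective strength."""
    return key_bits // 2

# Classical brute force vs. quantum (Grover) search cost, in bits of work
for key_bits in (128, 192, 256):
    print(f"AES-{key_bits}: classical ~2^{key_bits} guesses, "
          f"Grover ~2^{grover_effective_bits(key_bits)} operations")
```

This is why AES-128’s effective strength drops to roughly 64 bits against a quantum adversary, while AES-256 retains a comfortable 128-bit margin.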
A quantum attack on U.S. financial systems could cause $2-3.3 trillion in indirect GDP losses, while a breach of Bitcoin’s encryption could lead to $3 trillion in losses (Hudson Institute Report). These risks show why PQC is not just a future consideration but an urgent priority for securing digital economies and national security.
PQC Algorithms Standardization
To counter the quantum threat, the cryptographic community has been working tirelessly to develop PQC algorithms that resist both classical and quantum attacks. The National Institute of Standards and Technology (NIST) has spearheaded this effort, evaluating 82 algorithms from 25 countries since 2016. In August 2024, NIST finalized three PQC standards, marking a significant milestone (NIST Report):
- FIPS 203 (ML-KEM): Previously called CRYSTALS-Kyber, this standard supports general encryption and key encapsulation, offering compact keys and fast performance for secure data transmission.
- FIPS 204 (ML-DSA): Derived from CRYSTALS-Dilithium, it’s tailored for digital signatures, which is crucial for applications like code signing to ensure software authenticity.
- FIPS 205 (SLH-DSA): Built on SPHINCS+, this stateless hash-based signature scheme provides a simpler alternative for digital signatures.
In March 2025, NIST added Hamming Quasi-Cyclic (HQC), a code-based algorithm, to its standards as a backup key encapsulation mechanism (NIST HQC Selection). HQC rests on a different mathematical foundation than Kyber, providing cryptographic diversity and hence an alternative should unforeseen vulnerabilities emerge in the lattice-based schemes. NIST is also evaluating 15 additional algorithms, with a draft standard for FN-DSA (based on FALCON) expected soon. These standards are ready for immediate adoption, and NIST urges organizations to begin transitioning now, as updating systems can take years.
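All of these KEM standards share the same three-operation interface: key generation, encapsulation, and decapsulation. The toy sketch below shows only that API shape, using classical Diffie-Hellman over a deliberately small Mersenne prime; it is emphatically not ML-KEM or HQC and offers no quantum resistance (all names and parameters are illustrative):

```python
import hashlib
import secrets

# Toy, INSECURE KEM: classical Diffie-Hellman over a 127-bit Mersenne
# prime, far too small for real use. It only illustrates the
# keygen / encaps / decaps interface that FIPS 203 standardizes.
P = 2**127 - 1   # a known Mersenne prime (toy-sized)
G = 3

def keygen():
    sk = secrets.randbelow(P - 2) + 1          # private exponent
    pk = pow(G, sk, P)                         # public value
    return pk, sk

def encaps(pk):
    r = secrets.randbelow(P - 2) + 1
    ct = pow(G, r, P)                          # ciphertext sent to key holder
    ss = hashlib.sha256(pow(pk, r, P).to_bytes(16, "big")).digest()
    return ct, ss

def decaps(sk, ct):
    return hashlib.sha256(pow(ct, sk, P).to_bytes(16, "big")).digest()

pk, sk = keygen()
ct, ss_sender = encaps(pk)
assert ss_sender == decaps(sk, ct)             # both sides share the secret
```

In ML-KEM the same three calls exist, but the hard problem underneath is a structured lattice rather than a discrete logarithm, which is what buys quantum resistance.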
Beyond the U.S., the European Telecommunications Standards Institute (ETSI) is advancing quantum-safe standards, while the UK’s National Quantum Strategy emphasizes PQC adoption alongside Quantum Key Distribution (QKD) (Report). These regional efforts complement NIST’s work, promoting global interoperability and a unified approach to PQC implementation. Such collaboration ensures PQC algorithms are rigorously vetted and widely accepted, forming a solid foundation for a quantum-secure future.
PQC Algorithms for Code Signing
Code signing uses digital signatures to verify software authenticity and integrity. As quantum computers threaten traditional signatures (RSA, ECDSA), PQC algorithms are essential to secure code signing. These algorithms, categorized as Lattice-Based (ML-DSA, FN-DSA) and Hash-Based (SLH-DSA, LMS), offer quantum resistance and align with CA/Browser Forum requirements for hardware-based key storage (HSM). Many tools, such as OpenSSL and various SPHINCS+ testkits, are readily available for developers to experiment with and integrate these new cryptographic standards.
Lattice-Based Algorithms
Lattice-based algorithms rely on complex mathematical problems in high-dimensional lattices, believed to resist quantum attacks. Their efficiency makes them ideal for code signing in high-throughput environments like DevSecOps pipelines.
- ML-DSA, derived from CRYSTALS-Dilithium, is NIST’s primary signature algorithm (FIPS 204). It offers fast signature generation and verification, with signature sizes of 2.4-4.8 KB, balancing security and performance. ML-DSA is a randomized (probabilistic) signature scheme by default, meaning repeated signatures on the same message differ. It integrates with tools like CodeSign Secure v3.02 and is compatible with HSMs like nCipher nShield Connect, and its moderate resource requirements make it suitable for enterprise workflows, though larger signatures may challenge constrained devices.
- FN-DSA, based on FALCON, is under NIST evaluation, with a draft standard expected in 2025. It produces smaller signatures (0.6-1.3 KB) than ML-DSA and verifies quickly, which is ideal for resource-limited environments like IoT firmware signing. Like ML-DSA, it is randomized: each signature includes a fresh salt, so signing the same message twice yields different signatures. However, FN-DSA’s complex key generation and floating-point-heavy signing make secure implementation difficult, which can limit its use for high-volume code signing. FN-DSA’s potential inclusion in PKI systems ensures compliance with hardware key storage mandates, enhancing its future role in code signing.
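The distinction between deterministic and randomized signing can be sketched with a toy scheme built on HMAC (a symmetric stand-in, not a real signature; the key and function names are illustrative):

```python
import hashlib
import hmac
import os

SECRET = b"demo-signing-key"  # toy symmetric stand-in for a private key

def sign_deterministic(msg: bytes) -> bytes:
    # Same key + same message -> identical "signature" every time
    return hmac.new(SECRET, msg, hashlib.sha256).digest()

def sign_randomized(msg: bytes):
    # A fresh salt is mixed in, so repeated signatures differ
    salt = os.urandom(16)
    return salt, hmac.new(SECRET, salt + msg, hashlib.sha256).digest()

def verify_randomized(msg: bytes, salt: bytes, tag: bytes) -> bool:
    expected = hmac.new(SECRET, salt + msg, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

msg = b"firmware-v1.2.3"
assert sign_deterministic(msg) == sign_deterministic(msg)
s1, s2 = sign_randomized(msg), sign_randomized(msg)
assert s1 != s2                                # salts differ, signatures differ
assert verify_randomized(msg, *s1) and verify_randomized(msg, *s2)
```

Randomized signing avoids leaking information through repeated identical signatures, at the cost of needing a good entropy source at signing time.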
Hash-Based Algorithms
Hash-based algorithms rely on the security of one-way hash functions, offering simplicity and high security.
- The Leighton-Micali Signature (LMS) algorithm, a stateful hash-based scheme (NIST SP 800-208), offers smaller signatures (1-3 KB) and faster verification than SLH-DSA, making it suitable for IoT or embedded device firmware. Its stateful nature requires secure state management to prevent one-time key reuse, which adds operational complexity and necessitates HSMs to securely store private keys and track the number of remaining signatures. It is ideal for long-term security, as its hash-based approach resists quantum attacks, ensuring signed software remains trusted for decades.
- SLH-DSA, based on SPHINCS+ (FIPS 205), is a stateless hash-based scheme, eliminating the need to track signature states. This simplicity makes it attractive for open-source projects or firmware updates, where managing state is challenging. However, SLH-DSA’s large signatures (8-16 KB) and slower verification can strain resource-constrained systems.
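The core idea behind all hash-based schemes, and the reason stateful ones like LMS must count signatures, shows up already in a toy Lamport one-time signature, the ancestor of both LMS and SLH-DSA (illustrative only; a real deployment must never reuse a one-time key pair):

```python
import hashlib
import secrets

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    # 256 pairs of random preimages; the public key is their hashes
    sk = [[secrets.token_bytes(32) for _ in range(2)] for _ in range(256)]
    pk = [[H(x) for x in pair] for pair in sk]
    return sk, pk

def bits(msg: bytes):
    digest = H(msg)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    # Reveal one preimage per message bit. Signing a SECOND message
    # would reveal more preimages and let forgers mix and match,
    # which is exactly why LMS must track its state.
    return [sk[i][b] for i, b in enumerate(bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(sig[i]) == pk[i][b] for i, b in enumerate(bits(msg)))

sk, pk = keygen()
sig = sign(sk, b"release-1.0.bin")
assert verify(pk, b"release-1.0.bin", sig)
assert not verify(pk, b"tampered.bin", sig)
```

LMS organizes many such one-time keys into a Merkle tree (hence the state counter), while SLH-DSA uses a much larger tree structure to eliminate the counter entirely, which is why its signatures are so much bigger.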
PQC Adoption
Early adoption of PQC offers strategic advantages, enabling organizations to maintain customer trust, comply with regulations, and lead in cybersecurity. Crucially, adopting PQC now also future-proofs long-lived digital certificates, such as code signing or root CA certificates issued today that remain valid for 5-10 years or more, ensuring they stay secure against future quantum threats. Industries like finance, healthcare, and telecom benefit from secure data exchange, and the tech industry is already integrating PQC into real-world applications: Google shipped hybrid PQC key exchange in Chrome 116 (August 2023) for secure web browsing, and QuSecure established a quantum-resilient satellite link via Starlink in March 2023, securing data across orbits.
Along with industry, governments are prioritizing PQC to protect critical infrastructure and national security. Under National Security Memorandum NSM-10, U.S. federal agencies must transition their systems to PQC by 2035, setting a clear timeline. The White House estimates that transitioning federal systems to PQC will cost $7.1 billion between 2025 and 2035.
How Can Encryption Consulting Help?
Encryption Consulting’s CodeSign Secure v3.02 empowers organizations to transition to post-quantum cryptography (PQC) seamlessly, ensuring quantum-resistant code signing for software across platforms like Windows, Linux, and macOS. By integrating NIST-standardized algorithms like ML-DSA and LMS, CodeSign Secure enables developers to sign artifacts with quantum-safe signatures, protecting against future quantum threats. Its support for hybrid signing strategies, combining traditional (RSA/ECC) and PQC algorithms, allows a smooth migration without disrupting existing DevSecOps pipelines.
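A hybrid signing policy boils down to "accept only if both signatures verify." The toy below uses HMACs as stand-ins for real RSA/ECDSA and ML-DSA signatures (all keys and names are illustrative, not CodeSign Secure’s actual API):

```python
import hashlib
import hmac

# Toy hybrid-signing sketch: an artifact is accepted only if BOTH the
# classical and the post-quantum signature verify. HMACs stand in for
# real signature schemes; the keys here are illustrative only.
CLASSICAL_KEY = b"rsa-stand-in"
PQC_KEY = b"ml-dsa-stand-in"

def _tag(key: bytes, artifact: bytes) -> bytes:
    return hmac.new(key, artifact, hashlib.sha256).digest()

def hybrid_sign(artifact: bytes) -> dict:
    return {"classical": _tag(CLASSICAL_KEY, artifact),
            "pqc": _tag(PQC_KEY, artifact)}

def hybrid_verify(artifact: bytes, bundle: dict) -> bool:
    return (hmac.compare_digest(bundle["classical"], _tag(CLASSICAL_KEY, artifact))
            and hmac.compare_digest(bundle["pqc"], _tag(PQC_KEY, artifact)))

artifact = b"installer.msi contents"
bundle = hybrid_sign(artifact)
assert hybrid_verify(artifact, bundle)
assert not hybrid_verify(b"patched installer", bundle)
```

The appeal of the hybrid approach is that an artifact stays trustworthy even if one of the two algorithms is later broken, which de-risks the migration window.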
CodeSign Secure’s client-side hashing and PKCS#11 wrapper enhance security and efficiency. Its scalability supports automated, policy-enforced signing, and it offers seamless integration with popular CI/CD tools like Jenkins, Bamboo, GitLab, and Azure DevOps. This helps organizations future-proof their software supply chains and achieve compliance by becoming quantum-ready.
Complementing this, Encryption Consulting’s Advisory Services provide tailored guidance to prepare for PQC adoption, conducting cryptographic audits to identify quantum-vulnerable systems. These services include risk assessments, compliance strategies, and training for teams to implement hybrid cryptography and upgrade HSMs for PQC code signing.
Conclusion
PQC is the future because it addresses the imminent quantum threat, backed by robust standards, government mandates, and industry innovation. With NIST’s finalized algorithms, PQC is already shaping our digital lives. Organizations and governments are not merely advised but are actively urged to begin their PQC migration strategies now to avoid significant security vulnerabilities and ensure the continued integrity and confidentiality of information in the quantum age. This transition also emphasizes the importance of crypto-agility, i.e., designing systems that can quickly and efficiently swap cryptographic algorithms as new threats emerge or better solutions become available.
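Crypto-agility is largely an architectural pattern: route every signing call through an algorithm registry keyed by name, so a migration becomes a configuration change rather than a code rewrite. A minimal sketch, with HMAC stand-ins and invented algorithm names:

```python
import hashlib
import hmac

# Minimal crypto-agility sketch: callers name an algorithm instead of
# hard-coding one, so a deprecated scheme can be swapped via config.
# Algorithm names and HMAC stand-ins are illustrative only.
REGISTRY = {
    "classical-hmac": lambda key, msg: hmac.new(key, msg, hashlib.sha256).digest(),
    "pq-hmac":        lambda key, msg: hmac.new(key, msg, hashlib.sha512).digest(),
}
DEFAULT_ALG = "classical-hmac"   # one config line to flip during migration

def sign(key: bytes, msg: bytes, alg: str = DEFAULT_ALG) -> bytes:
    return REGISTRY[alg](key, msg)

# Swapping algorithms is a config change, not a code change
assert sign(b"k", b"m") == sign(b"k", b"m", "classical-hmac")
assert sign(b"k", b"m", "pq-hmac") != sign(b"k", b"m", "classical-hmac")
```

Real systems add versioned algorithm identifiers to every stored signature so old artifacts remain verifiable after the default changes.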
By integrating CodeSign Secure into your organization, you can adopt PQC-ready signatures to ensure safe, quantum-resistant code signing, protecting software across platforms while meeting regulatory requirements. We will help you and your organization with strategic guidance, from cryptographic audits to compliance roadmaps, ensuring a seamless PQC transition to protect your data and stay secure.