
Common PKI Setup Issues: Grayed Out Enterprise CA Button 

Introduction

We have discussed common PKI setup issues in the past, and today we tackle another common one you may see. When going through Server Manager and following the steps to create an Enterprise CA, you will come across an option to select the type of CA you are attempting to set up. In some cases, the Enterprise CA option may be grayed out, even though it seems like you have everything in place. The steps below will walk you through remediating this issue.

Error Handling


While setting up an Issuing CA, if the Enterprise CA option is unclickable, it could be because the account you are using to set up the CA does not have sufficient permissions to complete this task. The user must be a member of the Enterprise Admins group to continue the setup. To fix this issue, follow the steps below:

  • First, open sysdm.cpl from Run to verify whether your machine is domain joined.
  • If your machine is domain joined, the domain will be shown here; if it is not, join your machine to the domain.
  • Open Active Directory Users and Computers and check the Enterprise Admins group to verify whether the account you are using is listed.
  • If it is not, create a new user to place in the Enterprise Admins group and use it to log in for the ADCS setup. Right-click Users -> New -> User.
  • Provide the user details and create a password.
  • Click Finish to complete adding the user, and note the user logon name.
  • Right-click the user you created and open Properties.
  • Go to the Member Of tab and click Add.
  • In the “Enter the object names to select” box, add Domain Admins and Enterprise Admins. Click Check Names after each entry.
  • Once done, click Apply to finish.
  • In the ADCS configuration wizard, on the Credentials page, click Change to change the credentials.
  • Provide the username and password of the account you created.
  • The Enterprise CA option should now be available. Follow the remaining setup steps to complete your ADCS configuration.
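If you want to verify group membership from a command line before re-running the wizard, a minimal sketch like the one below (Python wrapping the built-in whoami /groups command; the group-name check is illustrative) can confirm whether the logon token of the current account includes Enterprise Admins. Remember that group membership added while you are logged on only appears in a new logon session.

import subprocess

# List the security groups in the current logon token (Windows built-in command).
result = subprocess.run(["whoami", "/groups"], capture_output=True, text=True, check=True)

# The ADCS wizard keeps the Enterprise CA option grayed out if this group is missing.
if "Enterprise Admins" in result.stdout:
    print("Current account is a member of Enterprise Admins.")
else:
    print("Current account is NOT a member of Enterprise Admins; log on with the account created above.")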

Conclusion 

This may be a common issue, but the process to fix it can be a bit lengthier than those for past errors we have discussed. Sometimes, finding the reason an error is occurring can be more complicated than actually fixing it. At Encryption Consulting, we work with your organization to plan, implement, and troubleshoot any PKI setup you may want to do. To learn more about how we can help your organization, please reach out to [email protected] or visit www.encryptionconsulting.com.

Common PKI Setup Issues: Web Enrollment HTTPS Error 401.2

A big part of setting up your PKI is configuring web enrollment so that certificates can be distributed to users. There are many different issues that can occur, but one of the more common ones is the HTTPS 401.2 error. This occurs when you have set a certificate for HTTPS communication, but it does not contain the SAN values it needs. The steps below will walk you through remediating this issue.

Error Handling

Once you have completed web enrollment, when navigating to your ADCS site the HTTPS certificate might be reported as unavailable or your site may be shown as “Not Secure”. This can happen if the SAN attribute isn’t properly added to your certificate request.


Run the following command to set the proper flag and allow SAN Attributes to be used in your web certificate.

certutil -setreg policy\EditFlags +EDITF_ATTRIBUTESUBJECTALTNAME2


Restart your Active Directory Certificate Services.

net stop certsvc && net start certsvc


Once done, re-issue the web certificate and bind it to your website again.
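If you generate the replacement web server certificate request programmatically rather than through the MMC, make sure the SAN extension is present in the CSR itself. Below is a minimal sketch using the pyca/cryptography library; the hostnames are placeholders for your own ADCS web enrollment FQDNs.

from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa

# Key pair for the web server certificate (2048-bit RSA shown for brevity).
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Build a CSR that carries the SAN entries browsers will validate against.
csr = (
    x509.CertificateSigningRequestBuilder()
    .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "pki.contoso.local")]))  # placeholder name
    .add_extension(
        x509.SubjectAlternativeName(
            [x509.DNSName("pki.contoso.local"), x509.DNSName("ca01.contoso.local")]  # placeholder SANs
        ),
        critical=False,
    )
    .sign(key, hashes.SHA256())
)

print(csr.public_bytes(serialization.Encoding.PEM).decode())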


Conclusion

This is one of the more common web enrollment issues we run into with PKI. This is generally an easier fix than most, so it is not very time-consuming to take care of. At Encryption Consulting, we work with your organization to plan, implement, and troubleshoot any PKI setup you may want to do. To learn more about how we can help your organization, please reach out to [email protected] or www.encryptionconsulting.com.

Common PKI Setup Issues: 0x80070005 (WIN32: 5 ERROR_ACCESS_DENIED)

No matter what your experience level is, when setting up a PKI you can run into many issues. They may be more common issues or issues you have never seen before, so understanding how to handle these types of errors is very important.  Part of the PKI setup process is running multiple certutil commands from a command prompt. In our example today we will discuss the access denied error you may run into when running any of these commands.

Error Handling

The ldap: 0x32: LDAP_INSUFFICIENT_RIGHTS and 0x80070005 (WIN32: 5 ERROR_ACCESS_DENIED) errors are common ones you may run into when trying to run different certutil commands, even from an Administrator command prompt. Luckily, the fix is as simple as checking the permissions of the account attempting to run the command: the error most likely means your logged-in account does not have the right permissions. As long as you switch to an account that has Enterprise Admin rights and open a new Administrator command prompt, rerunning the command should clear up this issue.



Conclusion

As you can see, some of these fixes can be extremely simple. Other issues, however, may cause more problems or require more in-depth fixes. At Encryption Consulting, we work with your organization to plan, implement, and troubleshoot any PKI setup you may want to do. To learn more about how we can help your organization, please reach out to [email protected] or www.encryptionconsulting.com.

In-Depth Analysis of FIPS 205: Stateless Hash-Based Digital Signature Standard

The Federal Information Processing Standards Publication (FIPS) 205 introduces a groundbreaking approach to digital signatures with its Stateless Hash-Based Digital Signature (SLH-DSA) Standard. Our blog presents the framework for digital signatures, leveraging hash-based techniques to enhance both security and efficiency in cryptographic systems. 

Overview of SLH-DSA 

FIPS 205, or the Stateless Hash-Based Digital Signature Standard, represents a significant evolution in digital signature technology. This standard focuses on using hash-based methods to generate and verify digital signatures without maintaining state information between operations. The stateless design is a key feature that helps address various cryptographic security concerns.

Core Components of SLH-DSA 

Stateless Operation

The SLH-DSA framework is designed to operate without the need to maintain internal state information. This stateless approach is crucial for enhancing security because it eliminates vulnerabilities associated with state management. By avoiding state retention, SLH-DSA reduces the risk of attacks that exploit state information.

Cryptographic Functions

SLH-DSA employs several fundamental cryptographic functions to ensure the integrity and security of digital signatures:

  • Hash Functions

    Central to SLH-DSA are hash functions that process data to produce fixed-size hash values. These hash values are essential for verifying the integrity of data and ensuring that the digital signature is valid and untampered.

  • Pseudo-Random Functions (PRFs)

    PRFs are utilized to generate secure random values that are critical for various cryptographic operations. The use of PRFs strengthens the security of the signature generation process by ensuring that the values used are unpredictable and random.
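To make the role of hash functions and fresh randomness concrete, here is a toy Lamport-style one-time signature sketch in Python. It only illustrates the hash-based principle that SLH-DSA builds on; SLH-DSA itself combines WOTS+, FORS, and a Merkle hypertree and is far more compact, and the helper names below are our own.

import hashlib, os

H = lambda b: hashlib.sha256(b).digest()
BITS = 256  # one pair of secrets per message-digest bit

def keygen():
    # Two random secrets per bit (one for bit value 0, one for 1); the public key is their hashes.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(BITS)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(msg: bytes, sk):
    digest = H(msg)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(BITS)]
    # Reveal the secret that corresponds to each digest bit; the key must never be reused.
    return [sk[i][bit] for i, bit in enumerate(bits)]

def verify(msg: bytes, sig, pk) -> bool:
    digest = H(msg)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(BITS)]
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(bits))

sk, pk = keygen()
sig = sign(b"firmware image v1.0", sk)
assert verify(b"firmware image v1.0", sig, pk)      # valid signature accepted
assert not verify(b"firmware image v1.1", sig, pk)  # tampered message rejected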

Security Enhancements

The FIPS 205 standard incorporates several key enhancements to improve security over previous digital signature standards:

  • Address Types

    New address types, such as WOTS_PRF and FORS_PRF, have been defined for the WOTS+ and FORS secret key value generation. These additions address vulnerabilities in earlier key generation methods, providing more secure ways to handle secret keys.

  • Algorithmic Updates

    The standard replaces SHA-256 with SHA-512 in several functions, including Hmsg, PRFmsg, H, and Tℓ, to address identified weaknesses in SHA-256, particularly for higher security categories. SHA-512 offers enhanced security properties, making it suitable for more demanding applications.

  • Mitigation Strategies

    To further improve security, FIPS 205 introduces new methods to counteract multi-target long-message second preimage attacks. These strategies help protect against sophisticated attacks that exploit weaknesses in the hashing process.

Parameter Sets and Approvals

The standard carefully selects and approves parameter sets to ensure compliance with security requirements:

  • Approved Parameter Sets

    FIPS 205 specifies the use of only 12 out of 36 parameter sets defined in previous specifications. The focus is on ‘simple’ instances of SHA2 and SHAKE parameter sets, which are deemed to meet current security standards effectively.

    1. SHA2-based
      • SLH-DSA-SHA2-128s
      • SLH-DSA-SHA2-128f
      • SLH-DSA-SHA2-192s
      • SLH-DSA-SHA2-192f
      • SLH-DSA-SHA2-256s
      • SLH-DSA-SHA2-256f
    2. SHAKE-based
      • SLH-DSA-SHAKE-128s
      • SLH-DSA-SHAKE-128f
      • SLH-DSA-SHAKE-192s
      • SLH-DSA-SHAKE-192f
      • SLH-DSA-SHAKE-256s
      • SLH-DSA-SHAKE-256f

Note: The parameter sets for SLH-DSA include “s” and “f” to indicate the optimization goals: 

“s” = Small signatures: This means the parameter set is optimized for reducing the size of the digital signature, which is useful for saving bandwidth and storage. However, this often comes at the cost of slower signing operations. 

“f” = Fast signing: This means the parameter set is optimized for faster signature generation. It is ideal for systems where signing speed is critical, but it typically results in larger signature sizes that require more bandwidth and storage.  

NIST recommends SHAKE-based sets for long-term use, as they offer greater flexibility and stronger standardization. 


Revisions and Adaptations

FIPS 205 includes several revisions from earlier versions to enhance clarity and functionality: 

  • Domain Separation

    The updated standard includes domain separation cases for signing both raw messages and message digests. This change accommodates different scenarios in digital signature operations and improves the flexibility of the standard.

  • Bit Extraction Methods

    Adjustments have been made to the methods for extracting bits from message digests used in key generation. These changes align with reference implementations and address ambiguities in previous specifications.

Practical Applications 

FIPS 205 is highly relevant for organizations and systems that rely on digital signatures for securing information and communications. Its stateless approach provides significant advantages in security, reducing the risks associated with stateful systems. By implementing SLH-DSA, organizations can enhance the security of their digital transactions and ensure the integrity of their communications. 

Key Benefits

  • Enhanced Security: The stateless design and improved cryptographic functions provide robust protection against various types of attacks. 
  • Flexibility: The standard’s updates and revisions allow for greater adaptability to different operational scenarios. 
  • Compliance: Adopting FIPS 205 helps organizations meet stringent security requirements and align with best practices in digital signature technology. 

How Encryption Consulting Can Help

As quantum computing advances, the need to secure your organization’s data and communications against future threats is more pressing than ever. We offer post-quantum cryptography advisory services to help you navigate these challenges effectively. Our services include:

  • Assessment

    We conduct thorough evaluations of your current cryptographic infrastructure to identify vulnerabilities and prepare for quantum threats. This includes assessing digital certificates, cryptographic keys, and overall crypto-governance.

  • Strategy

    We develop a customized roadmap for transitioning to post-quantum cryptographic solutions, ensuring your organization’s data remains secure. Our strategies are designed to align with your specific needs and risk tolerance.

  • Implementation

    We support the seamless integration of quantum-resistant cryptographic solutions into your existing systems. This includes planning, executing pilots, and ensuring compliance with the latest standards.

Conclusion 

FIPS 205 represents a major advancement in digital signature standards with its Stateless Hash-Based approach. The incorporation of enhanced cryptographic functions, updated algorithms, and refined parameter sets establishes a new standard for security in digital signatures. By leveraging SLH-DSA, organizations can ensure greater protection for their digital communications and transactions, aligning with modern security requirements and best practices.

Understanding FIPS 204: The Module-Lattice-Based Digital Signature Standard 

The Federal Information Processing Standards Publication (FIPS) 204 introduces the Module-Lattice-Based Digital Signature Standard. This standard is designed to address the growing need for security in an era where traditional cryptographic methods may be vulnerable to quantum computing attacks. Here’s a detailed overview of FIPS 204, its purpose, and its implications for modern cryptographic practices. 

What is FIPS 204? 

FIPS 204 is a standard developed by the National Institute of Standards and Technology (NIST) that defines a lattice-based digital signature algorithm called ML-DSA (Module-Lattice-Based Digital Signature Algorithm). Unlike traditional cryptographic standards, which rely on mathematical problems that are vulnerable to quantum attacks, FIPS 204 utilizes lattice-based cryptography, a field that offers promising resistance to such emerging threats.

The goal of FIPS 204 is to provide a robust digital signature method, ML-DSA, that maintains security in the face of quantum computing advancements. 

Sizes of keys and signatures of ML-DSA 

ML-DSA-44

  • Private Key: 2560 bytes
  • Public Key: 1312 bytes
  • Signature Size: 2420 bytes
  • RBG strength required: Should be at least 192 bits (recommended), but must be at least 128 bits.

Note: You may use an RBG with 128-bit security. But if it provides less than 192 bits, then the overall security classification (NIST-defined security levels that map to post-quantum cryptographic strengths) of ML-DSA-44 is downgraded from category 2 to category 1. Therefore, if you use a weaker RBG (128 bits), then even though the algorithm is designed for category 2, you can only claim category 1 (128-bit security).

ML-DSA-65

  • Private Key: 4032 bytes
  • Public Key: 1952 bytes
  • Signature Size: 3309 bytes
  • RBG strength required: 192-bits

ML-DSA-87

  • Private Key: 4896 bytes
  • Public Key: 2592 bytes
  • Signature Size: 4627 bytes
  • RBG strength required: 256-bits
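For a quick sense of the bandwidth impact, the ML-DSA signature sizes above can be compared against a classical RSA-2048 signature of 256 bytes. A small sketch using only the figures quoted in this section:

# Signature sizes in bytes, taken from the FIPS 204 figures quoted above.
ml_dsa_signature_bytes = {"ML-DSA-44": 2420, "ML-DSA-65": 3309, "ML-DSA-87": 4627}
rsa_2048_signature_bytes = 256  # classical baseline for comparison

for name, size in ml_dsa_signature_bytes.items():
    print(f"{name}: {size} bytes, about {size / rsa_2048_signature_bytes:.1f}x an RSA-2048 signature")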

Key Features and Objectives 

  1. Quantum Resistance

    The primary driver behind FIPS 204 is to offer a cryptographic solution resistant to quantum computing attacks. Quantum computers have the potential to solve complex mathematical problems that underpin current cryptographic algorithms like RSA and ECC, making them vulnerable to future breaches. Lattice-based cryptography, the foundation of FIPS 204, is believed to be secure against these quantum threats, thus providing a higher level of future-proof security.

  2. Lattice-Based Cryptography

    FIPS 204 employs lattice-based cryptography, which involves complex geometric structures known as lattices. These lattices are used to construct algorithms that are computationally challenging to break, even with the advanced capabilities of quantum computers. The strength of lattice-based methods lies in their resistance to attacks that can undermine traditional cryptographic systems.

  3. Digital Signature Algorithm

    The standard specifies a digital signature algorithm that enables secure authentication and integrity verification of digital messages. Digital signatures are essential for ensuring that the information has not been altered and verifying the identity of the sender. FIPS 204 provides a detailed framework for generating and validating these signatures, ensuring reliability and security.

  4. Interoperability

    By setting a standardized approach for lattice-based digital signatures, FIPS 204 promotes interoperability across various systems and platforms. Organizations that adopt this standard can ensure their digital signatures work seamlessly within different environments, enhancing compatibility and ease of integration.

  5. Implementation Guidelines

    FIPS 204 offers comprehensive guidelines for the practical implementation of lattice-based digital signatures. This includes procedures for key generation, signature creation, and verification processes. Adhering to these guidelines helps ensure that cryptographic implementations are robust, secure, and consistent with high security standards.


Implications for Security and Compliance

  1. Enhanced Security

    The introduction of FIPS 204 represents a significant step towards bolstering digital security. The lattice-based approach offers a higher level of protection against potential future threats from quantum computing. Organizations adopting this standard can better safeguard their data and communications, making it more resilient to advanced attacks.

  2. Regulatory Compliance

    FIPS standards, including FIPS 204, are essential for compliance with federal regulations. Organizations operating under such regulations must adhere to these standards to demonstrate their commitment to security. Implementing FIPS 204 helps ensure that an organization meets these regulatory requirements and maintains a high standard of data protection.

  3. Future-Proofing

    FIPS 204 is a forward-looking standard that addresses the evolving landscape of cryptographic threats. By integrating lattice-based cryptography, organizations can future-proof their digital security measures, preparing for potential advances in technology that could otherwise compromise traditional cryptographic systems.

  4. Strategic Adoption

    Adopting FIPS 204 is a strategic move for organizations looking to stay ahead of the curve in cryptographic security. As quantum computing continues to develop, having a lattice-based digital signature solution in place positions organizations to effectively handle emerging threats and maintain secure operations.

How Encryption Consulting Can Help with FIPS 204 and Post-Quantum Cryptography 

Our post-quantum cryptography services are designed to secure your data and communications as quantum technology advances. 

  • Risk Assessment: We evaluate your current cryptographic systems to identify vulnerabilities and assess risks related to quantum threats, including potential impacts on your digital certificates and cryptographic keys. 
  • Quantum Readiness Roadmap: We create a tailored strategy and roadmap to guide your transition to quantum-resistant cryptography. Our approach ensures you’re prepared for emerging threats and compliant with industry best practices. 
  • Seamless Implementation: We manage the implementation of post-quantum solutions, from proof of concept to full deployment, ensuring a smooth transition and compliance with NIST standards. 

Conclusion 

FIPS 204 marks a significant advancement in the field of digital signatures by incorporating lattice-based cryptography. This standard is designed to enhance security in an era where traditional cryptographic methods may fall short due to the rise of quantum computing. By adopting FIPS 204, organizations can benefit from robust, future-proof digital signature solutions that ensure data integrity and security. As we enter the quantum era, FIPS 204 provides a solid foundation for addressing both current and future security challenges, reinforcing the importance of proactive and resilient cryptographic practices. 

In-Depth Overview of FIPS 203: The Module-Lattice-Based Key-Encapsulation Mechanism Standard

The Federal Information Processing Standards (FIPS) 203 publication introduces the Module-Lattice-Based Key-Encapsulation Mechanism (ML-KEM) Standard, which provides a cutting-edge cryptographic framework designed to secure data against emerging quantum computing threats. Our blog explores the core elements, parameter sets, implementation differences, and practical considerations outlined in the FIPS 203 standard. 

Introduction to ML-KEM 

ML-KEM is a key encapsulation mechanism (KEM) used to protect symmetric keys, which are crucial for encrypting and decrypting data. The standard defines three primary operations: 

  • Key Generation (KeyGen): This operation generates a pair of keys, a public key and a private key. The public key is distributed for encryption purposes, while the private key is kept secure and used for decryption. 
  • Encapsulation (Encaps): Using the public key, this process produces a ciphertext that contains a symmetric key. This ciphertext can be safely transmitted over insecure channels. 
  • Decapsulation (Decaps): With the private key, this operation retrieves the symmetric key from the ciphertext. The symmetric key is then used for subsequent encryption or decryption tasks. 
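The three operations compose into a simple hybrid flow: the sender encapsulates against the recipient’s public key, both sides derive the same 32-byte shared secret, and that secret keys a symmetric cipher such as AES-256-GCM. The sketch below illustrates the flow; ml_kem_keygen, ml_kem_encaps, and ml_kem_decaps are placeholders for whichever FIPS 203 implementation your library provides, not real function names.

from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import os

# --- Recipient side -------------------------------------------------------
# ek, dk = ml_kem_keygen()                        # placeholder: generate encapsulation/decapsulation keys

# --- Sender side ----------------------------------------------------------
# shared_secret, ciphertext = ml_kem_encaps(ek)   # placeholder: 32-byte secret + KEM ciphertext
shared_secret = os.urandom(32)                    # stand-in so the symmetric part below is runnable

nonce = os.urandom(12)
aead = AESGCM(shared_secret)                      # the 32-byte shared secret keys AES-256-GCM
wrapped = aead.encrypt(nonce, b"application payload", None)

# --- Recipient side -------------------------------------------------------
# shared_secret = ml_kem_decaps(dk, ciphertext)   # placeholder: recovers the same 32-byte secret
plaintext = AESGCM(shared_secret).decrypt(nonce, wrapped, None)
assert plaintext == b"application payload"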

Parameter Sets

FIPS 203 specifies three parameter sets for ML-KEM, each tailored to different security levels and performance characteristics:

ML-KEM-512

  • Security Level: Provides a baseline level of security suitable for many standard applications. 
  • Key and Ciphertext Sizes: Offers a balance between security and performance, with encapsulation keys of 800 bytes, decapsulation keys of 1632 bytes, ciphertexts of 768 bytes, and a fixed 32-byte shared secret key.

ML-KEM-768

  • Security Level: Enhances security compared to ML-KEM-512, making it suitable for more sensitive applications. 
  • Key and Ciphertext Sizes: Includes larger encapsulation keys of 1184 bytes, decapsulation keys of 2400 bytes, ciphertexts of 1088 bytes, and a 32-byte shared secret key, balancing greater security with increased data sizes.

ML-KEM-1024

  • Security Level: Provides the highest level of security among the three parameter sets, ideal for highly sensitive or long-term protection needs. 
  • Key and Ciphertext Sizes: Features the largest encapsulation keys of 1568 bytes, decapsulation keys of 3168 bytes, ciphertexts of 1568 bytes, and a 32-byte shared secret key, which may affect performance due to the increased data sizes. 

Each parameter set includes variables that determine the size of matrices and vectors used in the key generation and encryption processes. These parameters are crucial for tailoring the cryptographic operations to different security and performance requirements. 

Key and Ciphertext Sizes

The FIPS 203 standard specifies the sizes of keys and ciphertexts for each parameter set, which directly impact the amount of data handled during encryption and decryption:

ML-KEM-512

  • Encapsulation Key: 800 bytes 
  • Decapsulation Key: 1632 bytes 
  • Ciphertext: 768 bytes 
  • Shared Secret Key: 32 bytes 
  • Required RBG strength: at least 128 bits

ML-KEM-768

  • Encapsulation Key: 1184 bytes 
  • Decapsulation Key: 2400 bytes 
  • Ciphertext: 1088 bytes 
  • Shared Secret Key: 32 bytes 
  • Required RBG strength: at least 192 bits

ML-KEM-1024

  • Encapsulation Key: 1568 bytes 
  • Decapsulation Key: 3168 bytes 
  • Ciphertext: 1568 bytes 
  • Shared Secret Key: 32 bytes 
  • Required RBG strength: at least 256 bits

These sizes reflect the amount of data involved in the cryptographic processes and influence both the security and performance of the system. 

Differences from CRYSTALS-Kyber

FIPS 203 builds on the CRYSTALS-Kyber scheme, incorporating several key updates and modifications: 

  • Fixed Shared Secret Length

    Unlike CRYSTALS-Kyber, which allowed for variable-length shared secret keys, ML-KEM specifies a fixed length of 256 bits. This standardization simplifies integration and use, providing a consistent size for the shared secret key across applications.

  • Updated Fujisaki-Okamoto Transform

    ML-KEM employs a modified version of the Fujisaki-Okamoto transform. This update excludes the hash of the ciphertext in the derivation of the shared secret, aligning with current security practices to streamline the process.

  • Randomness Handling

    Previous versions of the algorithm required hashing of initial randomness to ensure its quality. ML-KEM removes this step, relying instead on NIST-approved randomness generators to guarantee sufficient randomness without additional processing.

  • Input Validation

    The standard introduces explicit checks for input validity that were not present in earlier versions. For example, ML-KEM verifies that the encapsulation key decodes correctly from its byte array, ensuring proper format and integrity.


Updates from Initial Draft

The final version of FIPS 203 incorporates several revisions based on feedback from the initial public draft: 

  • Domain Separation

    To prevent the misuse of keys across different security levels, domain separation is introduced in the key generation process. This ensures that keys intended for one security level cannot be mistakenly used for another, enhancing overall system security.

  • Correction of Matrix Indices

    Errors related to matrix indices in the initial draft were corrected to align with the original CRYSTALS-Kyber specification. This adjustment ensures accuracy and consistency in the implementation of ML-KEM.

Practical Implementation Considerations

When implementing ML-KEM, consider the following practical aspects: 

  • Selecting a Parameter Set

    Choose the parameter set that best matches your security requirements and performance constraints. Higher security parameter sets offer better protection but may impact system performance due to increased data sizes or processing requirements.

  • Performance vs. Security Trade-offs

    Understand the balance between security and performance. Stronger security settings provide greater protection but may result in slower performance or larger data sizes. Assess your specific needs to determine the most appropriate parameter set.

  • Compliance

    Ensure that your implementation adheres to the specifications outlined in FIPS 203. Compliance with these standards is crucial for achieving secure key encapsulation and maintaining data protection.

How Encryption Consulting Can Help 

We provide end-to-end post-quantum cryptography services that are customized to meet your organization’s unique requirements and help you adapt to the quantum era.

  • Quantum Risk Evaluation: Identify vulnerabilities in existing encryption protocols and key management systems. 
  • Quantum Readiness Roadmap: Develop a tailored strategy for transitioning to quantum-resistant solutions, aligned with NIST and other standards. 
  • Customized Security Measures: Implement security measures based on data sensitivity and criticality. 
  • Implementation Support: Provide assistance with the transition to post-quantum cryptographic algorithms, including Proof of Concept development and vulnerability assessments. 
  • Visibility and Compliance: Enhance visibility into cryptographic practices and ensure compliance with industry standards. 
  • Future-Proofing: Adapt to emerging quantum threats with flexible models and ongoing monitoring to maintain long-term resilience. 
  • Expert Consultation: Benefit from our specialized tools and best practices for robust cryptographic security. 

Conclusion 

FIPS 203 and the ML-KEM standard represent significant advancements in cryptographic technology, particularly in preparing for potential future threats posed by quantum computing. By understanding the parameter sets, differences from previous schemes, and practical considerations, organizations can effectively implement ML-KEM to enhance their data protection strategies. For detailed guidance, book a one-to-one session to understand how we can help you meet the best practices and compliance. 

GlobalSign Public CA Integration with CertSecure Manager v3.1

Certificate management is a challenge that many IT professionals know all too well. From chasing down renewals to fixing configuration errors and keeping up with the demands of a growing company, it can feel like a never-ending battle. For many enterprises, managing certificates issued by public Certificate Authorities (CAs) remains fragmented and labor-intensive. With CertSecure Manager, Encryption Consulting has addressed this challenge head-on through native integration with GMO GlobalSign’s Atlas platform. This post dives into the real struggles of certificate management and shows how CertSecure Manager’s automation and template management features can make life easier for IT teams who rely daily on GlobalSign Public CA for issuing and managing publicly trusted certificates.

Addressing the Public CA Management Challenge

Organizations today operate across complex, multi-cloud, and hybrid environments. Certificates are issued from both internal (private) and external (public) CAs. While private CAs offer greater control, public CAs like GlobalSign are critical for securing externally facing assets like web applications, APIs, mobile backends, and third-party integrations. 

However, managing public CA certificates often involves: 

  • Manual CSR generation and tracking 
  • Repetitive domain validation 
  • Separate tooling and portals for issuance and renewal 
  • Inconsistent enforcement of certificate policies 
  • High risk of unexpected expirations or configuration drift 

First, there’s the constant pressure of renewals. Certificates expire at different times and missing just one can lead to website downtime or security vulnerabilities. Then there are the configuration errors. A small mistake in setting up a certificate can compromise the entire security setup, turning what should be a secure connection into a potential risk. And for companies that are scaling quickly, the number of certificates to manage can grow exponentially. Many IT professionals find themselves spending more time managing certificates than focusing on their core responsibilities.  

CertSecure Manager v3.1 bridges this gap by offering automated, policy-driven, and audit-ready certificate issuance directly from GlobalSign Atlas APIs, from within our unified CLM platform. 

What’s New in CertSecure Manager v3.1 with GlobalSign Public CA Integration 

The GlobalSign CA integration in CertSecure Manager v3.1 is designed to support enterprise-scale certificate issuance without adding operational burden. Key capabilities include: 

  1. Certificate Issuance from GlobalSign Atlas

    CertSecure Manager integrates with GlobalSign’s Public CA to issue publicly trusted certificates for web servers, applications, and APIs. Organizations can request, approve, and retrieve certs from Atlas with minimal configuration.

  2. Template-Based Certificate Requests

    Using CertSecure Manager’s template management, teams can define reusable certificate configurations including crypto algorithms, key sizes, usage extensions, and validity periods. This eliminates human error and ensures consistency across large deployments.

  3. Pre-Validation & Domain Management

    CertSecure Manager leverages Atlas’s domain validation model to support pre-validated domains, eliminating repeated challenges during issuance. The result: faster provisioning and fewer deployment delays.

  4. Automated Renewals

    Certificates issued via GlobalSign are tracked by CertSecure Manager’s internal lifecycle engine. Renewal agents automatically initiate revalidation (if required), request new certificates, and deploy them across configured endpoints, reducing the risk of unexpected expirations.

  5. Centralized Visibility and Audit Logging

    All GlobalSign-issued certificates are inventoried in CertSecure Manager’s dashboard, along with internal certificates. Teams get real-time visibility, expiration tracking, role-based access control (RBAC), and detailed audit trails, which are critical for compliance and governance.

Here’s how it works: CertSecure Manager v3.1 automates the entire certificate lifecycle, from issuance to renewal to revocation. This means no more manually tracking expiration dates or worrying about configuration mistakes. One standout feature introduced in this release is in-platform certificate template management. Pre-configured templates make it easy to issue certificates for different use cases, whether it’s securing a website or managing internal applications. Simply select the appropriate template, and the system takes care of the rest. It’s a straightforward way to ensure consistency and avoid errors, saving time and reducing stress.

How to Integrate GlobalSign CA with your CertSecure Manager

API Credential Setup and mTLS Certificate

Generate a GlobalSign Atlas mTLS certificate and API credentials linked to an active service and a valid identity from the GMO GlobalSign Atlas console. More information is available on the GlobalSign Support Page.

Connector Configuration in CertSecure Manager 

  • Go to the CertSecure Manager UI and download the GlobalSign connector installer from Utilities>Connectors>Download. 
  • Once the connector is configured with all the required details, you can complete the CA integration by navigating to Administration > CA Management, clicking Add CA, and entering the necessary information. 

Certificate Request Workflow

  • A PKI Admin creates a pre-configured certificate template in Inventory > Template Management > Certificate Templates. 
  • Users or systems request certificates via CertSecure Manager.
  • Requests follow the organization-defined policies and workflows. 
  • Upon successful validation of policies, the platform calls GlobalSign Atlas APIs to issue the certificate. 

Certificate Deployment & Tracking 

  • Certificates can be automatically deployed to target devices using Renewal Agents Integration. CertSecure Manager supports a wide range of servers including IIS, Apache, F5, Tomcat, MongoDB, Microsoft SQL Server, Oracle Database Server, etc. 
  • CertSecure Manager tracks certificate metadata, monitors expiration, and automates renewal.  


Conclusion 

While certificate management may not always take center stage in IT operations, its impact is undeniable. A single expired certificate can disrupt services, undermine security, and erode user trust. That’s why having a dependable, automated, and unified Certificate Lifecycle Management (CLM) platform is a necessity for most organizations. 

With the latest release, CertSecure Manager extends its capabilities through seamless integration with GlobalSign Atlas APIs, enabling organizations to manage public and private certificates with ease, consistency, and full lifecycle visibility. Whether you’re securing web applications, managing internal services, or navigating multi-cloud deployments, CertSecure Manager offers centralized control without added complexity. 

As a vendor-neutral platform supporting integrations with Microsoft ADCS, DigiCert, GlobalSign, Sectigo, EJBCA, and more, CertSecure Manager positions itself as the single source of truth for your enterprise certificate operations. 

If you’re looking to eliminate manual processes, reduce the risk of outages, and bring consistency to your certificate strategy, contact us at [email protected] or request a demo by visiting https://www.encryptionconsulting.com

The Role of CLM in Enterprise Security

Digital trust is foundational to enterprise operations and the management of digital certificates has become a mission-critical task. Certificates are the backbone of secure communications, enabling encryption, authentication, and data integrity across networks, applications, and devices. 

However, the landscape is rapidly evolving. In 2022, Google announced a move toward 90-day certificate validity for public TLS certificates, citing the need for enhanced agility and reduced exposure to compromised keys. This shift is part of a broader industry trend toward shorter certificate lifespans, which, while improving security, also significantly increases the operational burden on organizations. 

On 11 Apr 2025, the CA/Browser Forum voted for and approved Ballot SC-081v3, a proposal led by Apple to shorten the lifespan of SSL/TLS certificates progressively over the next few years, culminating in a maximum lifespan of 47 days by 2029. Without automation and centralized oversight, this frequency can easily lead to missed renewals, expired certificates, and service outages. 

These developments underscore the critical need for Certificate Lifecycle Management (CLM), a structured, automated approach to managing certificates throughout their lifecycle. CLM not only helps prevent costly outages and security breaches but also ensures compliance, operational efficiency, and alignment with modern security frameworks like Zero Trust. 

What is Certificate Lifecycle Management?

Certificate Lifecycle Management refers to the end-to-end process of managing digital certificates, from issuance and deployment to renewal and revocation. It ensures that certificates are always valid, trusted, and compliant with security policies.

The lifecycle typically includes the following stages: 

  1. Discovery  – Identifying all certificates across the enterprise. 
  2. Enrollment  – Requesting and issuing certificates. 
  3. Provisioning  – Deploying certificates to the appropriate systems. 
  4. Monitoring  – Tracking certificate status and expiration. 
  5. Renewal  – Replacing certificates before they expire. 
  6. Revocation  – Invalidating compromised or unused certificates. 

Why CLM Matters in Enterprise Security

  1. Preventing Outages and Downtime

    Expired certificates can cause application failures, website outages, and service disruptions. In 2021, a major cloud provider experienced a global outage due to an expired certificate, highlighting the critical need for proactive certificate management.

    CLM tools provide automated alerts and renewals, ensuring certificates are updated before expiration.

  2. Mitigating Security Risks

    Certificates are often targeted by attackers to impersonate trusted entities or intercept encrypted traffic (man-in-the-middle attacks). Poorly managed certificates, such as those that are self-signed, weak, or expired, can become entry points for cyber threats.

    CLM enforces policy-based issuance, strong cryptographic standards, and revocation of compromised certificates, reducing the attack surface.

  3. Ensuring Compliance

    Regulations like PCI-DSS, HIPAA, GDPR, and SOX require secure data transmission and identity verification. CLM helps enterprises maintain audit trails, enforce policies, and demonstrate compliance during security assessments.

  4. Supporting Zero Trust Architecture

    In a Zero Trust model, every entity must be authenticated and authorized before accessing resources. Certificates play a key role in device and user authentication. CLM ensures that these certificates are valid, trusted, and up to date, enabling secure access control.

Key Components of an Effective CLM Strategy

Certificate Discovery and Inventory

Many organizations lack visibility into their certificate landscape. A robust CLM solution should: 

  • Scan networks to discover all certificates (internal and external). 
  • Classify certificates by type, issuer, and expiration. 
  • Maintain a centralized inventory with metadata and ownership. 

Automation and Orchestration

Manual certificate management is error-prone and inefficient. Automation enables: 

  • Auto-enrollment and provisioning via APIs or integrations. 
  • Scheduled renewals and revocations. 
  • Integration with DevOps pipelines and CI/CD tools. 

Policy Enforcement

CLM tools should enforce enterprise-wide policies such as minimum key sizes, approved cryptographic algorithms, permitted Certificate Authorities and templates, and maximum validity periods.

Monitoring and Alerting

Real-time monitoring helps detect: 

  • Expiring or expired certificates. 
  • Unauthorized certificate issuance. 
  • Certificate anomalies or misconfigurations. 

Alerts can be integrated with SIEM tools or incident response platforms. 
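As a simple illustration of the monitoring stage, the sketch below (the hostname is a placeholder) fetches a server’s TLS certificate and reports how many days remain before it expires; a CLM platform performs this check continuously, across the entire certificate inventory.

import socket, ssl
from datetime import datetime, timezone

def days_until_expiry(hostname: str, port: int = 443) -> int:
    # Fetch the server certificate over a validated TLS connection.
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # Convert the certificate's notAfter field to a timezone-aware datetime.
    not_after = datetime.fromtimestamp(ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc)
    return (not_after - datetime.now(timezone.utc)).days

remaining = days_until_expiry("www.example.com")  # placeholder hostname
print(f"Certificate expires in {remaining} days")
if remaining < 30:
    print("Renewal should be triggered now.")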

Integration with Identity and Access Management (IAM)

Certificates are often used for machine identities, user authentication, and API security. CLM should integrate with IAM systems to: 

  • Issue certificates based on user roles or device trust. 
  • Revoke certificates when users leave or devices are decommissioned. 
  • Support multi-factor authentication (MFA) and single sign-on (SSO). 

Challenges in Certificate Lifecycle Management

Despite its importance, CLM comes with challenges: 

  • Certificate Sprawl: Enterprises may manage thousands of certificates across hybrid environments. 
  • Shadow IT: Teams may issue certificates without IT oversight. 
  • Lack of Standardization: Different teams may use different CAs or tools. 
  • Shorter Lifespans: Modern certificates have shorter validity, increasing renewal frequency. 

These challenges make centralized and automated CLM not just a best practice, but a necessity. 

Benefits of Implementing CLM

  • Reduced Risk: Prevents outages and security breaches due to expired or misused certificates.
  • Operational Efficiency: Automates repetitive tasks and reduces manual errors.
  • Improved Visibility: Centralized dashboard for all certificates across the enterprise.
  • Regulatory Compliance: Ensures adherence to industry standards and audit readiness.
  • Enhanced Trust: Maintains the integrity of digital identities and secure communications.


Future of CLM: AI and Machine Learning

The next evolution of CLM involves AI-driven insights and predictive analytics. Future platforms may: 

  • Predict certificate failures based on usage patterns. 
  • Recommend optimal certificate configurations. 
  • Detect anomalies in certificate issuance or usage. 

As enterprises adopt IoT, edge computing, and multi-cloud architectures, CLM will become even more critical in managing  machine identities at scale. 

How could Encryption Consulting help?  

One of the most comprehensive solutions in the CLM space is CertSecure Manager by Encryption Consulting. Designed to address the growing complexity of certificate environments, CertSecure Manager offers a centralized, automated, and policy-driven approach to certificate lifecycle management. 

Key Features of CertSecure Manager

  • Centralized Certificate Inventory: Automatically discovers and inventories certificates across cloud, on-prem, and hybrid environments. 
  • Automated Lifecycle Management: Handles issuance, renewal, and revocation of certificates with minimal human intervention. 
  • Policy Enforcement Engine: Ensures compliance with enterprise security policies and industry standards. 
  • Role-Based Access Control (RBAC): Provides granular access management to ensure only authorized users can manage certificates. 
  • Integration with Leading CAs and DevOps Tools: Seamlessly integrates with public and private Certificate Authorities, as well as CI/CD pipelines. 
  • Real-Time Monitoring and Alerts: Offers dashboards and alerts for expiring or misconfigured certificates. 
  • Audit and Reporting: Maintains detailed logs and reports for compliance and forensic analysis.

Benefits for Enterprises

  • Reduced Risk of Outages: Automated renewals and alerts prevent service disruptions. 
  • Improved Security Posture: Enforces strong cryptographic standards and revokes compromised certificates swiftly. 
  • Operational Efficiency: Reduces manual workload and human error. 
  • Scalability: Supports large-scale environments with thousands of certificates. 

CertSecure Manager is particularly well-suited for organizations adopting  Zero Trust, DevSecOps, and cloud-native architectures, where certificate sprawl and short lifespans are common challenges. 

Additionally, Encryption Consulting’s PKI-As-A-Service helps your organization to simplify your PKI deployment with end-to-end certificate issuance, automated lifecycle management, policy enforcement, and seamless compliance with industry security standards. 

Conclusion

Certificate Lifecycle Management is no longer a niche IT function; it is a strategic pillar of enterprise security. As digital ecosystems grow in complexity, the need for visibility, automation, and control over digital certificates becomes paramount.

By investing in a robust CLM strategy, enterprises can prevent outages, reduce risk, ensure compliance, and build a foundation of trust in their digital operations. 

Securing the Future of Code Signing with CNSA 2.0 Compliance and PQC

Introduction

The rapid advancement of quantum computing is no longer a theoretical concern; it is an impending threat to classical cryptography. Algorithms such as Shor’s, which can factor large integers and compute discrete logarithms exponentially faster than any classical algorithm, threaten to break schemes such as RSA and ECC. Likewise, Grover’s algorithm weakens symmetric encryption by effectively halving key strength, threatening the integrity of cryptographic systems that use shorter key lengths. The implications are significant for foundational security protocols such as TLS, VPNs, digital signatures, and particularly code signing.

Recognizing this, the U.S. National Security Agency (NSA) has introduced and published the Commercial National Security Algorithm Suite 2.0 (CNSA 2.0). This cryptographic effort mandates the use of post-quantum cryptographic (PQC) algorithms for securing national security systems (NSS) and classified communications. CNSA 2.0 is not merely a recommendation, but a strategic shift, backed by government policy, aimed at hardening systems against both current and future cryptanalytic capabilities.

CNSA 2.0 explicitly names ML-KEM (for key encapsulation) and ML-DSA (for digital signatures) as required PQC algorithms, both of which were selected by NIST in its PQC standardization process for their strong security and performance characteristics. These algorithms are lattice-based, relying on mathematical problems believed to be resistant to quantum attacks, unlike RSA and ECC. In this blog, we’ll examine the details of CNSA 2.0 and explore its algorithmic foundations. We’ll also showcase how Encryption Consulting’s Code Signing Solution, CodeSign Secure, empowers organizations to maintain trust, integrity, and compliance in software distribution as they transition into the post-quantum era.

What Is CNSA 2.0?

CNSA 2.0, or the Commercial National Security Algorithm Suite 2.0, is the latest suite of cryptographic algorithms mandated by the U.S. National Security Agency (NSA) for securing National Security Systems (NSS) in the post-quantum era. Released in September 2022, CNSA 2.0 represents a significant shift in cryptographic strategy, explicitly designed to prevent risks posed by both classical and quantum adversaries. It supersedes the older CNSA 1.0 and Suite B protocols, aligning national security cryptographic standards with modern quantum-resilient initiatives.

Unlike NIST’s Post-Quantum Cryptography (PQC) standardization, which aims to provide algorithms for general-purpose commercial use, the NSA has set clear and urgent expectations for the transition to PQC through CNSA 2.0, particularly in high-trust environments like code signing. The suite compels the use of quantum-resistant algorithms as they offer strong security even against quantum computing capabilities, and simultaneously phases out legacy algorithms such as RSA, DSA, and finite-field DH. Whether you’re leading product security, managing DevOps pipelines, architecting cryptographic systems, or ensuring regulatory compliance, this shift impacts you.

Explore our in-depth blog post to know more about CNSA 2.0 in detail.

CNSA 2.0 Cryptographic Algorithms

To address the dual challenges of advanced classical threats and future quantum adversaries, the CNSA 2.0 suite introduces a set of cryptographic algorithms according to the different use cases within National Security Systems (NSS). These algorithms are categorized based on their application, from software and firmware signing to general-purpose public-key cryptography.

1. Algorithms for Software and Firmware Signing

One of the most important use cases covered under CNSA 2.0 is code signing. Code signing is the process of digitally signing software, firmware, or updates to prove they are authentic and have not been modified.

As PQC gains urgency, hash-based signature algorithms stand out for their maturity, security guarantees, and NSA endorsement under the CNSA 2.0 suite. Two primary algorithms, Leighton-Micali Signature (LMS) and eXtended Merkle Signature Scheme (XMSS), form the backbone of trusted code signing to protect National Security Systems (NSS) in a post-quantum world.

These algorithms differ fundamentally from traditional RSA and ECDSA schemes. While the latter rely on number-theoretic assumptions (e.g., the discrete logarithm or integer factorization problems), LMS and XMSS rely on the security properties of cryptographic hash functions, such as preimage resistance, collision resistance, and second preimage resistance. These properties remain resilient to quantum attack even under Grover’s algorithm, which offers only a quadratic speedup, and therefore continue to provide a very high level of security.

Both LMS and XMSS are stateful signature schemes. This means that each signature requires unique internal state information that must be managed securely and updated atomically after every signature operation. Unlike RSA or ECDSA, where the same private key can be used to generate multiple signatures, LMS/XMSS keys are tied to a fixed number of valid signatures. This is because reusing a key state or signing two different messages with the same state can compromise security, allowing attackers to forge additional signatures.

In practice, this introduces strict requirements for:

  • State persistence and rollback protection: Especially in embedded or firmware environments, systems must prevent rollback to a previous signing state (e.g., due to power loss or system crashes).
  • Atomic signing operations: This refers to the detection of any interruption during signing and the ability to recover to avoid reuse.
  • Audit logging and key usage tracking: A signature counter must be maintained and protected to ensure integrity across reboots or system migrations.
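As a minimal sketch (our own simplification, not actual LMS/XMSS code) of the state handling the list above implies, the snippet below reserves and persists a signing index before any signature is released, so an interruption during signing can waste an index but cannot cause the same one-time key to be used twice. Rollback of the state file itself must still be prevented by the platform, as noted above.

import json, os

STATE_FILE = "lms_key_state.json"  # illustrative path; in production this lives in an HSM or protected store

def reserve_next_index(max_signatures: int) -> int:
    """Atomically reserve the next one-time-signature index before signing."""
    state = {"next_index": 0}
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            state = json.load(f)

    index = state["next_index"]
    if index >= max_signatures:
        raise RuntimeError("Key exhausted: no one-time signature leaves remain; a new key pair is required.")

    # Persist the advanced counter *before* the signature is produced or released.
    state["next_index"] = index + 1
    tmp = STATE_FILE + ".tmp"
    with open(tmp, "w") as f:
        json.dump(state, f)
        f.flush()
        os.fsync(f.fileno())
    os.replace(tmp, STATE_FILE)  # atomic rename on the same filesystem
    return index

leaf = reserve_next_index(max_signatures=1024)
print(f"Safe to sign with one-time key index {leaf}")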

LMS

LMS, originally developed by Leighton and Micali, is optimized for constrained environments such as bootloaders, smart cards, and hardware security modules (HSMs). It uses a hierarchical structure of Merkle trees where each leaf is associated with a one-time signature (OTS). The scheme supports parameter sets that allow tuning trade-offs between performance, size, and security.

  • Signature Size: 1.2 KB – 3 KB
  • Performance: LMS is computationally faster than XMSS, making it attractive for real-time signing operations on devices with limited CPU or memory.
  • Use Case: Ideal for environments where secure and efficient firmware signing is critical (e.g., IoT, BIOS/UEFI).

One of LMS’s advantages is its stateless verifier model. This means verification routines do not require maintaining any state, making LMS signatures easy to validate even in minimal environments such as ROM-based bootloaders or air-gapped devices.
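The “stateless verifier” property follows from how Merkle trees are checked: the verifier only needs the leaf value, its authentication path, and the signed tree root, with no signer state. A toy illustration with SHA-256 (our own simplification, not the actual LMS encoding) is shown below.

import hashlib

H = lambda *parts: hashlib.sha256(b"".join(parts)).digest()

def merkle_root(leaves):
    """Build a Merkle root over a power-of-two number of leaf hashes."""
    level = leaves
    while len(level) > 1:
        level = [H(level[i], level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def verify_leaf(leaf, index, auth_path, root):
    """Recompute the root from a leaf and its authentication path; no signer state needed."""
    node = leaf
    for sibling in auth_path:
        node = H(node, sibling) if index % 2 == 0 else H(sibling, node)
        index //= 2
    return node == root

# Four one-time public keys hashed into leaves (toy values).
leaves = [H(bytes([i])) for i in range(4)]
root = merkle_root(leaves)

# Authentication path for leaf 2: its sibling (leaf 3) and the hash of leaves 0 and 1.
auth_path = [leaves[3], H(leaves[0], leaves[1])]
assert verify_leaf(leaves[2], 2, auth_path, root)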

XMSS

XMSS, specified in RFC 8391, offers more features than LMS and introduces forward security through pseudorandom key generation and optional key randomization. It is suitable for use cases that demand longer-term cryptographic assurance and more sophisticated key management mechanisms.

  • Signature Size: Typically 2 KB to 5 KB, depending on parameter sets and security levels.
  • Security Property: Provides forward security, as it uses a binary Merkle tree with hash chains to derive one-time keys, ensuring each signature remains unforgeable even if some internal state is leaked.
  • Use Case: Suitable for high-assurance environments requiring strong key protection.

XMSS’s computational overhead is higher than LMS, which makes it more appropriate for higher-assurance systems with strong processing capabilities, such as firmware signing servers, secure code distribution services, or enterprise PKI infrastructures transitioning to PQC.

Here’s a quick comparison for LMS and XMSS, two NIST SP 800-208 standardized, hash-based digital signature schemes designed for secure firmware and software signing:

  • Leighton-Micali Signature (LMS): Asymmetric algorithm for digitally signing firmware and software. Specification: NIST SP 800-208. Parameters: all parameters approved for all classification levels; SHA-256/192 recommended.
  • eXtended Merkle Signature Scheme (XMSS): Asymmetric algorithm for digitally signing firmware and software. Specification: NIST SP 800-208. Parameters: all parameters approved for all classification levels.

2. Quantum-Resistant Public-Key Algorithms

With the rise of quantum computing, traditional public-key schemes such as RSA, DH, ECDSA, and ECDH are no longer considered future-proof. As part of the CNSA 2.0 roadmap, NSA has defined a suite of quantum-resistant public-key algorithms to guide future NSS deployments. While NIST’s final FIPS standardization for these algorithms is pending, NSA’s early announcement enables developers, vendors, and NSS operators to begin planning and building accordingly.

Since public-key algorithms are the core of code signing, digital signatures generated with these algorithms guarantee the authenticity and integrity of software and firmware. Therefore, CNSA 2.0 specifically advises the immediate adoption of hash-based signature schemes, such as Leighton-Micali Signature (LMS) and eXtended Merkle Signature Scheme (XMSS), for code signing. These are already standardized and approved for use in National Security Systems (NSS).

  • ML-KEM: Asymmetric algorithm for key establishment. Specification: FIPS 203. Parameters: use Level V parameters for all classification levels.
  • ML-DSA: Asymmetric algorithm for digital signatures. Specification: FIPS 204. Parameters: use Level V parameters for all classification levels.

Adapting CNSA 2.0 in Code Signing practices 

As quantum computing progresses toward practical possibilities, traditional cryptographic mechanisms, especially those relying on RSA and ECC, face existential risk due to their vulnerabilities. Code signing, a foundational element of software supply chain security, is particularly sensitive to these changes. It ensures that software, firmware, and configuration updates originate from a trusted source, are not tampered with in transit, and cannot be repudiated by the signer.

Therefore, under CNSA 2.0, firmware signing is identified as the ‘highest-priority signature use case’ in the post-quantum transition. The urgency originates from the fact that in many systems, the firmware validation algorithm is fixed at deployment, often residing in immutable hardware or boot-level firmware. The selected algorithms, LMS and XMSS, are already standardized by NIST in Special Publication 800-208, unlike some other post-quantum signatures that are still under evaluation, and the NSA explicitly calls for immediate implementation of these post-quantum signature schemes. Furthermore, CNSA 2.0 also highlights the importance of NIST’s newly approved algorithms, ML-KEM for key exchange and ML-DSA for signatures, once they are fully standardized and supported in hardware and software.

With the NSA laying out its quantum-resistant cryptography (QRC) transition roadmap through CNSA 2.0, organizations building or supporting National Security Systems (NSS) must now align their code signing implementations with a precise set of algorithms and follow operational constraints.

While CNSA 2.0 provides cryptographic direction, understanding how to meet policy mandates, especially those laid out in CNSSP 15, NSM-10, and associated CNSS/NIAP documentation, is critical for commercial vendors. Therefore, if your organization develops software or firmware for NSS, you’re not only required to use the right cryptographic algorithm suite (CNSA 2.0) but also to follow multiple policy rules issued by various security authorities.

The cryptographic posture required for any product, especially one performing sensitive code signing or signature validation, depends on its classification and use case.

For Type 1 equipment (typically used in classified or tactical systems), cryptographic implementations are governed by a trio of foundational documents: CJCSN 6510.04, CNSSAM 01-07, and NSM-5. Together, these guide cryptographic modernization for tactical and classified systems and explicitly require cryptographic algorithms suitable for code signing operations that ensure secure firmware authentication and integrity.

Additionally, these documents together require the use of CNSA 2.0-approved cryptographic tools and give clear guidance on when and how to use them, especially in cases where firmware must stay secure and is hard to update once deployed.

On the commercial side, particularly for vendors aiming to serve NSS or NIAP (National Information Assurance Partnership)-validated environments, compliance must align with the policy directives mentioned:

  • CNSSP 15: Lists the approved CNSA 2.0 algorithms that code signing keys and processes must use. This ensures that all digital signatures on software used in National Security Systems (NSS) are strong enough to resist future quantum attacks.
  • CNSSP 11 and NSM-10: These policies require the use of quantum-safe algorithms like LMS and XMSS in your systems and tell you exactly when and where to use them (e.g., for signing firmware).
  • CNSSP 156: This policy defines the official migration period (2025–2030) to switch from old cryptography (CNSA 1.0) to the more secure CNSA 2.0 standard, including cryptographic algorithms used specifically for code signing, while allowing flexibility for systems that are expensive or difficult to upgrade.

A critical takeaway is that if your system is expected to be used after 2030, you must use CNSA 2.0-approved algorithms from the start. CNSA 1.0 remains acceptable for certain legacy systems, but only where short-term cryptographic validity or operational feasibility justifies its continued use.

Algorithm Selection for Software vs. Firmware Signing

NSA handles software and firmware signing as distinct use cases, which originate from three technical considerations:

1. Standards Maturity

The hash-based signature algorithms LMS and XMSS were standardized earlier by NIST via SP 800-208 and already have CAVP validation available. This makes them the currently approved options for software and firmware signing under CNSA 2.0, while other quantum-resistant signatures may not yet be as readily available for integration.

These algorithms are “stateful,” which means they require careful management of one-time-use keys. They are especially suitable for long-term systems like firmware in embedded or constrained devices, where updating the signature process later might not be practical. Because LMS and XMSS are already commercially available and validated, the NSA recommends using them now rather than waiting for newer algorithms to become available.

ML-DSA, a stateless, lattice-based signature algorithm, is also approved under CNSA 2.0, but it may not yet be as readily available for integration. While it can be used for all signing use cases, including software and firmware, it is expected to be most useful where a large number of signatures are needed or where signing happens in a distributed environment. Once ML-DSA is validated and widely available, it may become the preferred choice for many organizations. However, any implementation of ML-DSA or ML-KEM must strictly follow FIPS 204 and FIPS 203, respectively, to be considered CNSA 2.0 compliant.
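For teams that want to start experimenting, the open-source liboqs project exposes ML-DSA through its Python bindings. The sketch below assumes the liboqs-python package (imported as oqs) is installed and that the "ML-DSA-65" identifier is enabled in the local liboqs build (older builds expose the same scheme as "Dilithium3"); it is an experimentation aid, not a validated or CNSA 2.0-compliant module.

```python
# Minimal sign/verify sketch with the liboqs-python bindings.
# Assumes `pip install liboqs-python` and an ML-DSA-enabled liboqs build.
import oqs

message = b"release-artifact.tar.gz digest"

with oqs.Signature("ML-DSA-65") as signer:
    public_key = signer.generate_keypair()   # private key stays inside the object
    signature = signer.sign(message)

with oqs.Signature("ML-DSA-65") as verifier:
    assert verifier.verify(message, signature, public_key)
```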

2. Urgency

NSA has prioritized firmware signing due to the deeply embedded nature of its cryptographic roots, such as root certificates, public keys, or secure boot keys, which are often hardcoded into hardware and cannot be easily updated once deployed. Therefore, securing embedded firmware with quantum-safe signatures is an urgent and critical priority.

3. Performance Alignment

LMS/XMSS impose higher performance costs (e.g., larger signature sizes and slower operations), but firmware signing is infrequent and localized, so these costs are acceptable and the schemes remain a good fit. High-throughput software environments may later adopt ML-DSA once validated.

Therefore, NSA currently approves only LMS and XMSS for signing use in NSS environments. Multi-tree variants such as HSS (Hierarchical Signature System) and XMSS^MT (multi-tree XMSS), which are already included in NIST SP 800-208, are not yet permitted for NSS, likely due to the complexity of their multi-tree state management.

As part of the CNSA 2.0 transition, vendors are expected to begin integrating LMS and XMSS signature verification into BIOS, UEFI, and embedded bootloaders. These hash-based signature schemes offer quantum-resilient protection and are currently the only PQC algorithms approved by the NSA for use in NSS.

Note: Validated ML-DSA will eventually become the preferred solution for high-throughput and distributed signing environments due to its statelessness, efficiency, and strong mathematical foundation in structured lattices.

However, NSA’s position is clear: the LMS/XMSS transition must begin now, especially for firmware use cases, given:

  • The expected delay in ML-DSA validation and tooling availability,
  • The long hardware lifecycles in critical NSS systems,
  • The risk that waiting results in post-quantum-insecure deployments that cannot be upgraded.

Once validated, ML-DSA will offer operational advantages, including:

  • Better scalability across CI/CD pipelines,
  • Simplified key management without state tracking,
  • Reduced signature size overhead compared to LMS/XMSS in some configurations.

More about Hash-Based Algorithms

SHA-384 and SHA-512 in CNSA 2.0

CNSA 2.0 states that the SHA-2 selections (SHA-384 and SHA-512) are sufficient for security, and their widespread commercial adoption ensures seamless interoperability across systems. SHA-384 is listed as a core approved hash function based on its demonstrated strength and NSA’s internal analysis, while SHA-512 has been explicitly added to CNSA 2.0 for scenarios where performance optimization is critical: its 64-bit word structure is particularly advantageous on modern 64-bit processors, often delivering faster throughput with comparable security guarantees.

However, deploying SHA-512 introduces an important consideration, i.e., interoperability. When integrating third-party or legacy systems that default to SHA-384, developers must ensure alignment across components to prevent failures in signature verification, HMAC generation, or message digest compatibility.
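As a minimal illustration, Python’s standard hashlib module exposes both functions; the practical interoperability point is simply that every component verifying a signature or HMAC must agree on which of the two digests is in use.

```python
import hashlib

payload = b"firmware-update.bin contents"

# SHA-384 and SHA-512 share the same 64-bit-word compression function;
# SHA-384 is a truncated variant with different initial values.
print(hashlib.sha384(payload).hexdigest())   # 96 hex characters (384 bits)
print(hashlib.sha512(payload).hexdigest())   # 128 hex characters (512 bits)
```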

Use of Other Hash Functions

There are special conditions under which other hash functions are permissible:

  • Truncated SHA-2 Variants (e.g., SHA-256/192): Permitted when a cryptographic algorithm approved by NSA or NIST explicitly defines the use of such truncated variants within its construction; LMS, for example, explicitly allows SHA-256/192.
  • SHA-3 Family (SHA3-384, SHA3-512): While not approved for general-purpose NSS use, these are acceptable in internal hardware processes, such as random number generation or key derivation within chips.

For instance, if a hardware-isolated, secure execution environment performs Key Derivation Function (KDF) operations internally using SHA3-512 without exposing the hash externally, this falls within acceptable practice boundaries.
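The two cases look like this in practice. The sketch below is illustrative only: it uses hashlib for the SHA-256/192 truncation defined in SP 800-208, and a simple SHA3-512 derivation stands in for an internal, hardware-isolated KDF (a real KDF would follow an SP 800-108-style construction).

```python
import hashlib

data = b"LMS leaf node input"

# SHA-256/192: the SHA-256 digest truncated to 192 bits (24 bytes),
# permitted only where a construction such as LMS explicitly defines it.
sha256_192 = hashlib.sha256(data).digest()[:24]

# SHA3-512 used purely inside an isolated environment, e.g. deriving internal
# key material that never leaves the device. Illustrative only, not a
# standards-compliant KDF.
internal_seed = b"device-unique-seed"
storage_key = hashlib.sha3_512(internal_seed + b"context:storage-key").digest()
```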

Key Considerations for Integrating LMS and XMSS into Code Signing Workflows

As organizations transition to quantum-safe code signing using hash-based signature schemes like LMS (Leighton-Micali Signature) and XMSS (eXtended Merkle Signature Scheme), several unique operational challenges must be addressed to ensure security and workflow continuity.

1. HSM compatibility

Most commercial HSMs are optimized for RSA or ECDSA, which do not need to keep track of anything between signatures. LMS and XMSS are different: they require maintaining signature state (such as counters or tree indices) securely within the module. Support for LMS/XMSS is therefore still emerging and often requires firmware upgrades or specialized modules. Without that support, keys might be handled outside the HSM, making them less secure.
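The bookkeeping involved can be sketched in a few lines. The class below is a simplified model of the state an HSM must guard for a stateful scheme: each leaf index is consumed exactly once, the state advances before a signature is released, and signing fails closed when capacity is exhausted. The derivation and signature steps are placeholders, not the SP 800-208 algorithms.

```python
import hashlib
import hmac

class StatefulSigningKey:
    """Toy model of LMS/XMSS-style state: a master seed plus a
    monotonically increasing leaf counter. Not a real SP 800-208 signer."""

    def __init__(self, master_seed: bytes, capacity: int):
        self.master_seed = master_seed
        self.capacity = capacity      # e.g. 2**h leaves for tree height h
        self.next_index = 0           # persisted atomically inside a real HSM

    def sign(self, message: bytes) -> tuple[int, bytes]:
        if self.next_index >= self.capacity:
            raise RuntimeError("key exhausted: provision a new tree")
        index = self.next_index
        self.next_index += 1          # advance state BEFORE releasing a signature
        # Deterministically derive the one-time key for this leaf (placeholder)
        ots_seed = hmac.new(self.master_seed, index.to_bytes(8, "big"),
                            hashlib.sha256).digest()
        placeholder_signature = hashlib.sha256(ots_seed + message).digest()
        return index, placeholder_signature
```

Advancing the counter before returning the signature is the key design point: a crash after signing must never leave the module able to reuse the same leaf.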

2. State synchronization in distributed environments

In modern DevOps and CI/CD pipelines, code signing is often distributed across multiple nodes or servers. LMS and XMSS’s stateful nature means that each signature consumes a unique part of the private key’s signing capacity, and reuse of state parameters can compromise the entire key. This demands a carefully coordinated approach where the signing state is consistently synchronized across all signing nodes. 
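One common coordination pattern is to hand each signing node a disjoint, pre-reserved block of leaf indices so that no two nodes can ever consume the same one-time key. The allocator sketched below is hypothetical; in practice this role is played by a central signing service or by the HSM itself.

```python
from dataclasses import dataclass

@dataclass
class LeafRange:
    start: int
    end: int          # exclusive
    next_index: int

def reserve_ranges(total_leaves: int, nodes: list[str], chunk: int = 1024) -> dict:
    """Give each CI/CD signing node a disjoint block of one-time-key indices."""
    ranges, cursor = {}, 0
    for node in nodes:
        if cursor + chunk > total_leaves:
            raise ValueError("not enough leaves for all nodes")
        ranges[node] = LeafRange(cursor, cursor + chunk, cursor)
        cursor += chunk
    return ranges

reservations = reserve_ranges(total_leaves=2**20, nodes=["build-01", "build-02"])
```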

3. Secure state backups

Backing up LMS or XMSS private keys involves more complexity than traditional key backups because the current signing state (e.g., signature counters or Merkle tree indices) must be preserved accurately and securely. The backup process must be tamper-evident and resistant to rollback or replay attacks, as restoring an outdated state can lead to signature reuse and compromise security. Therefore, organizations often employ custom tools or secure enclaves to protect this sensitive state information and ensure that backups remain consistent with ongoing signing operations. 
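A rollback check is the essential part of such a backup scheme: a backup carrying an older counter than the live state must never be restored, and an integrity tag makes silent edits detectable. The sketch below is a simplified illustration; handling of the MAC key is deliberately glossed over and would belong in an HSM or KMS.

```python
import hashlib
import hmac
import json

BACKUP_MAC_KEY = b"kept-in-an-hsm-or-kms-not-in-code"   # placeholder only

def make_backup(state: dict) -> dict:
    payload = json.dumps(state, sort_keys=True).encode()
    tag = hmac.new(BACKUP_MAC_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "tag": tag}

def restore_backup(backup: dict, live_next_index: int) -> dict:
    payload = backup["payload"].encode()
    expected = hmac.new(BACKUP_MAC_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, backup["tag"]):
        raise ValueError("backup failed integrity check")
    state = json.loads(payload)
    if state["next_index"] < live_next_index:
        raise ValueError("stale backup: restoring would allow one-time key reuse")
    return state

live = {"key_id": "lms-tree-01", "next_index": 4212}
backup = make_backup(live)
restored = restore_backup(backup, live_next_index=4212)
```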

HSM Integration for Quantum-Safe Code Signing

As organizations begin utilizing quantum-resistant digital signatures such as LMS and XMSS, Hardware Security Modules (HSMs) play a crucial role in enforcing security, compliance, and operational continuity.

Unlike traditional cryptographic algorithms, LMS and XMSS use one-time signature keys derived from a master private key (seed). Each signature must use a unique derived key, tracked using a secure counter. The HSM ensures that:

  • A finite number of signatures is generated per key pair
  • The one-time-use rule is enforced via internal state management
  • Derived keys are not reused or exported inappropriately

Each private key derivation is deterministic and based on a counter, meaning each derived private key can only produce one signature. This unique requirement places operational demands on the cryptographic system. HSMs enforce this state internally and prevent unauthorized resets or duplication of the signing environment, which would otherwise compromise the integrity of LMS/XMSS.

The Role of Hybrid Cryptography

As the cybersecurity industry gets ready for the post-quantum era, hybrid cryptographic solutions, those that combine classical algorithms like RSA or ECC with post-quantum cryptographic (PQC) algorithms, have become a popular transition strategy. These solutions aim to protect data in both the current and future threat landscape.

This dual-layer approach ensures that if a powerful quantum computer breaks a classical algorithm, the PQC layer still keeps the data secure. On the other hand, if the new PQC algorithms show unexpected weaknesses, the classical layer continues to offer a level of protection. Hybrid cryptography helps balance risk and security while PQC standards are still being tested and adopted.

However, the NSA, through its CNSA 2.0 guidance, does not require hybrid solutions for National Security Systems (NSS). The agency has confidence in the strength of the approved PQC algorithms and encourages a direct move to these quantum-safe standards.

That said, the NSA acknowledges that some industry standards may temporarily require hybrid implementations, especially due to the larger key and signature sizes of PQC algorithms. Still, it warns that hybrid systems can introduce added complexity and compatibility issues. As a result, hybrid cryptography is considered a temporary measure, not a long-term solution.

How Encryption Consulting Can Help? 

Encryption Consulting helps enterprises and governments implement CNSA 2.0-aligned signing infrastructures with full PQC and hybrid crypto support. 

CodeSign Secure v3.02 supports PQC out of the box, giving organizations a head start in adapting to the next era of cryptography without sacrificing usability or performance. It’s a smart move now and a necessary one for the future. 

Moving to CNSA 2.0 isn’t just about selecting the right algorithm. It’s about building an end-to-end code signing strategy that protects keys, automates workflows, enforces policy, and ensures compliance. That’s exactly what CodeSign Secure was built for.  

Here’s how CodeSign Secure supports CNSA 2.0:  

  • LMS & XMSS-Ready: Already supports the post-quantum signature schemes required for software and firmware signing.
  • HSM-Backed Key Protection: Your private keys stay protected inside FIPS 140-2 Level 3 HSMs, ensuring no exposure.
  • State Tracking Built-In: Automatically manages state for LMS and XMSS to ensure every signature is compliant.
  • DevOps Friendly: Integrates natively with Jenkins, GitHub Actions, Azure DevOps, and more.
  • Policy-Driven Security: Use RBAC, multi-approver (M of N) sign-offs, and custom security policies to control every aspect of your code signing.
  • Audit-Ready Logging: Get full visibility into every signing operation for easy reporting and compliance.

Whether you’re signing software for Windows, Linux, macOS, Docker, IoT devices, or cloud platforms, CodeSign Secure is ready to help you transition safely and efficiently.  


Conclusion 

CNSA 2.0 is here, and it’s more than a recommendation; it’s a roadmap to enhance your security measures. If you’re involved in software development, infrastructure, or compliance, now’s the time to start planning.  

With CodeSign Secure, you get the tools and automation you need to:  

  • Start signing with CNSA 2.0-compliant algorithms
  • Protect your keys and enforce strict policies
  • Stay ahead of deadlines without slowing down development

Want to see how it works? 

Reach out to us at [email protected] to schedule a demo or learn more about how CodeSign Secure can help you stay compliant and secure. 

How CBOM Differs from SBOM and Why It’s Crucial for Industry

As digital infrastructures expand and the software supply chain becomes more intricate, the need for transparency has never been greater. Organizations are not just expected to know what software they’re using, but also how that software is protected. In this context, tools like the SBOM and CBOM are playing a crucial role in enhancing visibility, strengthening security, and supporting compliance efforts across industries. This demand has led to the widespread adoption of SBOM, and more recently, growing interest in CBOM. 

CBOM is best understood as an extension of SBOM: while SBOM catalogs the components that make up a piece of software, CBOM focuses entirely on cryptographic elements such as keys, certificates, algorithms, and crypto libraries. With regulators, auditors, and security teams demanding more visibility into cryptographic usage, CBOM is rapidly emerging as an essential piece of the cybersecurity puzzle.

What is SBOM?

A Software Bill of Materials (SBOM) is a detailed record of all software packages, libraries, modules, and dependencies included in an application or system. SBOM consists of: 

  • Software components 
  • Version numbers 
  • Third-party information (libraries or frameworks) 
  • Licensing details 
  • Source repositories 
  • Interdependencies between components 
  • Security vulnerabilities such as CVE (Common Vulnerabilities & Exposures) 

The purpose of SBOM is to provide visibility into software makeup, much like an ingredient label does for food. It allows companies to track vulnerabilities, manage license compliance, and prepare for software supply chain attacks.

Why SBOM Matters Today?

In recent years, government mandates, especially the 2021 Executive Order on Cybersecurity in the U.S., have made SBOMs a requirement for federal software vendors. This move came after a wave of software supply chain attacks, where malicious or vulnerable components in widely used packages compromised thousands of systems. 

An SBOM helps companies: 

  • Identify and assess vulnerable libraries 
  • Ensure open-source compliance 
  • Accelerate incident response 
  • Improve vendor accountability 

However, while SBOMs offer essential visibility into software components, they often fall short when it comes to understanding how those components are secured. Most SBOMs do not capture details about cryptographic implementations such as what algorithms are used, how keys are managed, or whether expired certificates exist. This is where CBOM (Cryptographic Bill of Materials) becomes critical, providing deeper insight into the security posture of an application beyond just its components. 

So let’s dive into the concept of CBOM and understand why you need to implement it. 

What is a CBOM?

A Cryptographic Bill of Materials (CBOM) is a structured inventory that details every cryptographic asset in an organization’s software and systems. This includes encryption algorithms, digital keys, certificates, crypto protocols, and supporting libraries. 

Key Components of CBOM

A Cryptographic Bill of Materials (CBOM) provides deep visibility into how cryptography is implemented across software and systems. It tracks a wide range of security-relevant details, including: 

  • Cryptographic Algorithms: Documents the use of encryption and hashing algorithms such as AES, RSA, ECC, and SHA-256, helping assess algorithm strength and compliance relevance. This visibility becomes especially valuable when planning a transition to PQC (post-quantum cryptography), as CBOM helps identify legacy algorithms that may be vulnerable to quantum attacks. 
  • Key Information: Includes data on cryptographic key types (symmetric/asymmetric), key lengths, usage policies, and lifecycle stages, covering generation, storage, rotation, and destruction. 
  • Certificates: Tracks digital certificates in use (e.g., TLS/SSL certificates, code-signing certificates, client authentication certificates), their issuing authorities, and expiration timelines to avoid trust failures. 
  • Protocols: Details cryptographic communication protocols like TLS, SSH, and IPsec, ensuring secure data in transit and identifying outdated or misconfigured protocol usage. 
  • Cryptographic Libraries: Captures versions and implementations of crypto libraries such as OpenSSL, BoringSSL, BouncyCastle, and Microsoft CryptoAPI, key for patch management and vulnerability tracking. 
  • Algorithm Parameters: Specifies critical parameters like key sizes, modes of operation (e.g., CBC, GCM), padding schemes, and initialization vectors (IVs), which influence encryption effectiveness. 
  • Cryptographic Modules: Identifies hardware and software modules used for cryptographic operations, including Hardware Security Modules (HSMs), Trusted Platform Modules (TPMs), and smartcards. 
  • Validation & Compliance Status: Indicates whether the cryptographic components comply with standards such as FIPS 140-3, Common Criteria, or industry-specific requirements, crucial for regulated environments.
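To make the inventory concrete, here is the rough shape of a single CBOM record for a TLS certificate and its key, written as a Python dict. The field names are illustrative, loosely inspired by the cryptographic-asset support added in CycloneDX 1.6, and should be aligned with whatever schema your tooling standardizes on.

```python
# One illustrative CBOM record; field names are not a normative schema.
cbom_entry = {
    "asset_type": "certificate",
    "subject": "CN=api.example.com",
    "issuer": "CN=Example Issuing CA 01",
    "not_after": "2025-11-30T23:59:59Z",
    "public_key": {"algorithm": "RSA", "key_size": 2048},
    "signature_algorithm": "sha256WithRSAEncryption",
    "protocols": ["TLS 1.2", "TLS 1.3"],
    "library": {"name": "OpenSSL", "version": "3.0.13"},
    "compliance": {"fips_140_3": False},
    "quantum_safe": False,          # flags the asset for PQC migration planning
    "owner": "platform-security-team",
}
```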

CBOM vs SBOM: Core Differences

| Feature | SBOM | CBOM |
| --- | --- | --- |
| Focus Area | Software modules and dependencies | Cryptographic elements and controls |
| Purpose | Track vulnerabilities, licenses, and updates | Secure encryption usage and enforce crypto hygiene |
| Primary Users | Developers, DevOps, AppSec teams | Security architects, crypto owners, compliance leads |
| Sample Entries | Apache Struts, Log4j, React | RSA-2048, TLS 1.2, expired X.509 cert, SHA-1 hash |
| Regulatory Relevance | Executive Order 14028, NTIA SBOM specs | PCI DSS 4.0, NIST 800-57/175B, CNSA Suite |
| Tooling Maturity | Mature tools and standards (SPDX, CycloneDX) | Emerging tools; standardization still in progress |
| Use Cases | Vulnerability and patch management; software supply chain auditing; open-source license tracking | Cryptographic compliance (FIPS, PCI DSS); certificate lifecycle management; encryption algorithm risk assessment |

Why Organizations Are Turning to CBOM?

  1. Encryption Is Everywhere, But Rarely Understood

    Encryption is embedded in nearly every aspect of modern IT, from web traffic to authentication to cloud storage. Despite this, most organizations lack a clear view of:

    • What cryptographic libraries are in use
    • Where and how encryption keys are deployed
    • Whether algorithms used are compliant and secure
    • Whether certificates are valid and rotated

    This lack of clarity creates a significant blind spot in security and compliance efforts. CBOM directly addresses this gap by enabling organizations to build and maintain a comprehensive cryptographic inventory. It brings much-needed visibility into how cryptography is implemented, helping security teams uncover weaknesses, reduce risk, and enforce best practices across the enterprise.

  2. Regulations Are Tightening Cryptographic Controls

    Security frameworks are getting stricter about encryption. For example:

    • PCI DSS v4.0 mandates strong encryption and proper key management across the cardholder data environment.
    • NIST 800-57 and 800-175B provide guidelines on key lifecycles and algorithm suitability.
    • NSA’s CNSA Suite defines minimum encryption standards for protecting sensitive federal data. It replaces the older Suite B Cryptography and outlines stronger cryptographic algorithms to be used across national security systems.

    CBOM helps by tracking algorithm usage, documenting key properties, and ensuring adherence to prescribed controls.

  3. Preparing for Post-Quantum Transition

    Quantum computing is poised to render traditional cryptography obsolete. Public-key methods like RSA and ECC will no longer be safe. Organizations will need to replace their crypto quickly. In simple words, global security standards are evolving to include PQC as a new class of cryptographic algorithms designed to resist attacks by quantum computers.

    But migrating to PQC is not that simple, as most organizations do not have a clear understanding of where and how cryptographic algorithms are used across their systems. This lack of visibility poses a major risk when planning a transition.

    So before migration can happen, they must know:

    • Where RSA or ECC is being used
    • What libraries depend on them
    • What systems support post-quantum algorithms

    CBOM provides the foundational inventory needed to prepare for this transition, including:

    • Clarity on Cryptographic Dependencies: You can identify exactly where RSA, ECC, SHA-1, or other potentially deprecated algorithms are being used.
    • Awareness of Legacy Risks: Many systems still rely on outdated or weak cryptography. CBOM highlights these areas so they can be prioritized for replacement.
    • Readiness for Migration: Before implementing PQC, organizations must ensure their systems and libraries can support new algorithms. CBOM helps map that compatibility landscape.
    • Simplified Compliance: CBOM helps satisfy growing regulatory and industry demands for crypto transparency, such as PCI DSS v4.0 and upcoming PQC-readiness requirements.

  4. Better Response to Crypto Vulnerabilities

    When vulnerabilities like Heartbleed (OpenSSL) or Logjam (DH key exchange) surface, teams need immediate answers:

    • Is this vulnerability present in our environment?
    • Which systems are affected?
    • How quickly can we patch?

    Heartbleed was a critical vulnerability in certain versions of OpenSSL (2014) that allowed attackers to read sensitive memory content, including private keys and passwords, by exploiting a flaw in the TLS heartbeat extension.

    Logjam, discovered in 2015, exploited weaknesses in the Diffie-Hellman (DH) key exchange by forcing servers to downgrade to weaker 512-bit encryption, making it easier for attackers to decrypt secure communications.

    In both cases, organizations scrambled to assess impact but lacked centralized visibility into where vulnerable libraries or algorithms were being used. CBOM addresses this gap by providing a clear, up-to-date inventory of cryptographic components across your systems, allowing teams to immediately locate and respond to affected assets. Instead of spending days manually scanning infrastructure, teams can pinpoint risk exposure in minutes.
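As a hypothetical sketch of the kind of query a CBOM enables when a library advisory lands, assuming the inventory is a simple list of records like the one shown earlier:

```python
def affected_assets(cbom: list[dict], library: str, bad_versions: set[str]) -> list[dict]:
    """Return every inventoried asset whose crypto library version is in scope."""
    return [
        entry for entry in cbom
        if entry.get("library", {}).get("name") == library
        and entry.get("library", {}).get("version") in bad_versions
    ]

# A tiny illustrative inventory; real CBOMs hold thousands of records.
inventory = [
    {"asset_type": "tls_endpoint", "host": "payments.example.com",
     "library": {"name": "OpenSSL", "version": "1.0.1f"}},
    {"asset_type": "tls_endpoint", "host": "intranet.example.com",
     "library": {"name": "OpenSSL", "version": "3.0.13"}},
]

# Heartbleed affected OpenSSL 1.0.1 through 1.0.1f.
for hit in affected_assets(inventory, "OpenSSL", {"1.0.1", "1.0.1f"}):
    print("patch needed:", hit["host"])
```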

PQC Advisory Services

Prepare for the quantum era with our tailored post-quantum cryptography advisory services!

Benefits of CBOM to the Industry

CBOM goes beyond checklist compliance. It enables foundational improvements across all aspects of security, risk management, and governance.

Enhanced Crypto Hygiene

CBOM helps eliminate: 

  • Weak algorithms like MD5 or SHA-1 
  • Deprecated protocols like SSLv3 or TLS 1.0 
  • Misused keys or certificate misconfigurations 

This improves the overall security posture and reduces the risk of compromise. 

Accelerated Incident Response

When vulnerabilities or zero-day threats emerge, having a CBOM means organizations can: 

  • Instantly locate affected systems 
  • Prioritize updates and patches 
  • Prove to auditors that steps were taken 

This reduces both downtime and reputational risk. 

Reduced Shadow Crypto

CBOM helps discover unauthorized or unknown cryptographic assets (“shadow crypto”) that may have been implemented by individual teams or legacy applications without central approval. 

By identifying and managing shadow crypto, organizations avoid using unvetted or insecure encryption methods. 

Improved Key Management

Key sprawl is a real problem in cloud-native environments. With CBOM, organizations can track: 

  • Key ownership 
  • Expiration timelines 
  • Usage scopes 
  • Compliance with key length and rotation policies 

Future-Proofing Against Post-Quantum Threats

CBOM is the foundation for any post-quantum transition plan. It allows teams to: 

  • Identify where traditional algorithms are used 
  • Plan replacement strategies with PQC algorithms 
  • Avoid last-minute remediation under pressure 

Challenges in Adopting CBOM

Despite its benefits, CBOM adoption comes with challenges:

  • Lack of Standards

    Unlike SBOM, which benefits from well-established formats like SPDX and CycloneDX, CBOM is still in its early stages. No universal format or tooling has emerged yet.

  • Tooling and Automation

    Identifying cryptographic elements is more difficult than identifying software components. Organizations need tools that can parse binaries, scan infrastructure, and detect crypto usage automatically.

  • Organizational Ownership

    Who owns cryptography? Security? DevOps? Engineering? Legal? Organizations must establish clear roles and processes for managing cryptographic inventories across teams.

Best Practices for Successful CBOM Adoption

Despite these challenges, organizations can make meaningful progress with the right strategy. Here are key best practices: 

  1. Start with High-Risk and High-Value Systems

    Prioritize CBOM discovery for systems that:

    • Handle sensitive or regulated data (e.g., financial transactions, PII)
    • Are internet-facing or business-critical
    • Have compliance mandates (e.g., PCI DSS, FedRAMP, FIPS)

    This allows organizations to focus efforts where crypto visibility matters most.

  2. Leverage Existing SBOM Infrastructure

    If you already generate SBOMs using tools like Syft, OWASP Dependency-Track, or CycloneDX:

    • Extend those pipelines to also extract cryptographic libraries
    • Map discovered libraries (e.g., OpenSSL, Bouncy Castle) to potential crypto usage
    • Integrate with external scanners that add crypto-layer insights

    This bridges the gap between SBOM and CBOM until native CBOM support becomes mainstream.

  3. Use Specialized Cryptographic Discovery Tools

    Adopt tools that can (a lightweight scanning sketch follows this list):

    • Parse binaries and source code to detect crypto APIs and algorithms
    • Inventory TLS configurations, certificates, and key stores
    • Monitor for usage of deprecated or weak algorithms (e.g., SHA-1, RSA-1024)
  4. Define Ownership and Governance

    Establish a Cryptography Working Group or designate a Crypto Steward responsible for:

    • Managing the cryptographic inventory (CBOM)
    • Defining crypto lifecycle policies (e.g., rotation, expiration)
    • Responding to crypto-related vulnerabilities or audits
    • Preparing for post-quantum transitions

    Formalizing ownership ensures crypto isn’t treated as an afterthought.

  5. Integrate CBOM into Compliance and Audit Frameworks

    Map your CBOM efforts to requirements like:

    • PCI DSS 4.0 Requirement 12.3.3 (inventory and management of cryptographic assets)
    • NIST 800-53 / NIST 800-175B (crypto control baselines)
    • Zero Trust Architecture principles (explicit trust, asset validation)

    This reinforces CBOM as not just a technical tool, but a compliance and risk-management enabler.
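As referenced in step 3 above, here is a deliberately small example of the discovery idea: scanning a source tree for calls that suggest weak or deprecated cryptography. Real discovery tools also parse binaries, TLS configurations, and key stores; the patterns and file types below are illustrative and should be tuned to your environment.

```python
import re
from pathlib import Path

# Patterns that usually deserve a closer look; tune for your codebase.
WEAK_CRYPTO_PATTERNS = {
    "MD5": re.compile(r"\bmd5\b", re.IGNORECASE),
    "SHA-1": re.compile(r"\bsha-?1\b", re.IGNORECASE),
    "RSA-1024": re.compile(r"\b1024\b.*\bRSA\b|\bRSA\b.*\b1024\b", re.IGNORECASE),
    "SSLv3/TLS1.0": re.compile(r"SSLv3|TLSv?1\.0", re.IGNORECASE),
}

def scan_tree(root: str) -> list[tuple[str, int, str]]:
    """Walk Python sources under `root` and flag suspicious crypto references."""
    findings = []
    for path in Path(root).rglob("*.py"):
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            for label, pattern in WEAK_CRYPTO_PATTERNS.items():
                if pattern.search(line):
                    findings.append((str(path), lineno, label))
    return findings

for file, line, label in scan_tree("."):
    print(f"{file}:{line}: possible {label} usage")
```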

As cryptography becomes central to cybersecurity, privacy, and regulatory compliance, the concept of CBOM is expected to evolve rapidly. Here are the key trends shaping the future of CBOM: 

Standardization and Interoperability

The future of CBOM heavily depends on the development of standardized formats and interoperable frameworks. Currently, there’s no universally accepted way to define or share a CBOM. However, organizations like NIST, ISO, and open-source communities are likely to drive efforts toward creating structured schemas and consistent formats.

Integration with SBOM Platforms

Instead of existing as isolated artifacts, CBOMs will likely be embedded within SBOMs or be tightly linked to them. As SBOM tools mature, many will begin including cryptographic metadata as part of their default output. This integration will provide a more holistic view of software security by detailing both software components and the cryptographic methods securing them.

Automation and CI/CD Integration

As environments grow more complex, manual generation of CBOMs will no longer be practical. The industry is moving toward automated discovery of cryptographic assets, including embedded libraries, algorithms, and keys. These tools will be integrated into CI/CD pipelines, enabling automatic CBOM generation during the software build or deployment process.

Post-Quantum Cryptography (PQC) Readiness

With the advent of quantum computing, organizations must prepare to transition to post-quantum cryptographic algorithms. Future CBOMs will play a key role in identifying where vulnerable algorithms like RSA and ECC are used. CBOMs will include data that shows whether an algorithm is quantum-safe or considered at risk, helping security teams prioritize upgrades and remediation well before quantum attacks become practical.

Regulatory and Compliance Adoption

As regulatory frameworks evolve, CBOM is poised to become a mandatory part of compliance documentation. Standards like PCI DSS v4.0, FIPS 140-3, and the EU’s Cyber Resilience Act are already beginning to emphasize cryptographic controls. We’ve already talked about one of the important requirements of PCI DSS related to CBOM in our previous blog.


In the near future, audits may specifically require evidence of cryptographic inventories, making CBOMs essential for demonstrating compliance with emerging cybersecurity laws and industry standards. 

How can Encryption Consulting help?

You have heard a lot about PQC readiness in this blog, and we are here to help with exactly that. We provide PQC Advisory Services to prepare you for the quantum era ahead and to help you transition to PQC. Before you can migrate to post-quantum algorithms, you need a clear, accurate inventory of all cryptographic assets in your environment. Our Compliance Services help you build this inventory by thoroughly analyzing your infrastructure and identifying gaps, giving you the visibility needed to move forward with confidence.

Here’s how we support your transition and CBOM readiness:

  • Comprehensive Cryptographic Inventory: We identify where encryption is used, what algorithms and libraries are involved, and how cryptographic elements are configured in your environment. 
  • Risk Identification: Our analysis pinpoints deprecated algorithms, weak configurations, expired certificates, and undocumented cryptographic implementations. 
  • PQC Transition Planning: We help map your current usage of RSA, ECC, and other vulnerable schemes to prepare a phased migration plan aligned with NIST’s post-quantum recommendations. 
  • Expert Support: Our team of experts is here to guide you through every challenge, from inventory creation to PQC migration and beyond. 

Conclusion

SBOM has laid the groundwork for transparency in the software supply chain. Now, CBOM is stepping in to do the same for cryptography. As encryption becomes more critical to data security, compliance, and privacy, understanding and managing your cryptographic landscape is no longer optional, it’s essential. 

CBOM enables organizations to: 

  • Map their cryptographic usage 
  • Identify and eliminate weak or risky implementations 
  • Prove compliance with emerging standards 
  • Prepare for the transition to post-quantum cryptography 

While CBOM is still a maturing concept, it’s quickly becoming a strategic requirement for any security-conscious enterprise. In a digital world where encryption is everywhere, knowing what cryptography you use, and how you use it, could be the key to protecting your most valuable assets.