
How Are Digital Signatures Different From Electronic Signatures?

In this discussion, let us understand: What is an e-signature? What is a digital signature? What is meant by an electronic signature? Are the two types of signatures similar or different? Which signature is more secure, and what are the common use cases for digital signatures and electronic signatures? How is code signing related to digital signatures? And what is Encryption Consulting's CodeSign Secure, and how is it relevant to your organization? Let's get into the topic to answer these questions:

If you are new to the concept of e-signatures, there is a good chance of confusing "digital signature" with "electronic signature". Quite often you will encounter people using the two terms interchangeably, which is not entirely accurate, as there are significant differences between these two types of e-signatures.

The major difference is security: digital signatures are mainly used to secure documents and provide authorization, as they are backed by Certificate Authorities (CAs), whereas electronic signatures only capture the intent of the signer. Let us first understand what a digital signature and an electronic signature are.

What is a Digital Signature?

A digital signature is a type of electronic signature, as both are meant for document signing, except that digital signatures are more secure and authentic. With a digital signature, the signer of the document must have a Public Key Infrastructure (PKI) based digital certificate, issued by a certificate authority, linked to the document. This provides authenticity to the document, as it is vouched for by a trusted certificate authority.

Let us understand digital signatures in a simple way by taking paper-based documents as an example. There are usually two concerns in any documentation process: one is the authenticity of the person signing the contract, and the other is whether the integrity of the document is protected from tampering. To address these concerns, we have notaries in place to provide authorization and safeguard the integrity of the document.

Similar to the notary in physical contracts, we have certificate authorities (CAs) authorizing digital signatures with PKI-based digital certificates. In a digital signature, a unique fingerprint binds the digital document to the PKI-based digital certificate, which is used to establish the authenticity of the document and its source and to assure that the document has not been tampered with.
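
To make this concrete, here is a minimal sketch of the underlying sign-and-verify flow using Python's cryptography library. It is an illustration only: the key pair is generated inline for brevity, whereas in a real digital signature the private key stays with the signer and the public key is distributed inside a CA-issued X.509 certificate.

```python
# Minimal sketch: sign a document with a private key, then verify the signature
# with the corresponding public key. For brevity the key pair is generated
# inline; in practice the public key reaches verifiers via a CA-issued
# X.509 certificate.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Signer side: create (or load) a private key and sign the document.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
document = b"Contract: Alice agrees to deliver 100 units to Bob."
signature = private_key.sign(document, padding.PKCS1v15(), hashes.SHA256())

# Verifier side: check authenticity and integrity with the signer's public key.
public_key = private_key.public_key()
try:
    public_key.verify(signature, document, padding.PKCS1v15(), hashes.SHA256())
    print("Signature valid: the document is authentic and untampered.")
except InvalidSignature:
    print("Signature invalid: the document or signature was altered.")
```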

Currently, two major document-processing platforms provide digital signature services with strong PKI-based digital certificates:

  • Adobe Signature

    There are two types of signatures provided by Adobe – certified and approval signatures. A certified signature is used for authentication purposes: a blue ribbon is displayed at the top of the document, indicating the actual author of the document and the issuer of the PKI-based digital certificate. An approval signature, on the other hand, captures the physical signature of the issuer or author along with other significant details.

  • Microsoft Word Signature

    Microsoft supports two types of signatures: visible and invisible. A visible signature provides a signature field for signing, similar to a physical signature. An invisible signature is more secure, as it cannot be accessed or tampered with by unauthorized users, and it is commonly used for document authentication and enhanced security.

What is an Electronic Signature?

An electronic signature is not as secure or complex as a digital signature, as no PKI-based certificates are involved. An electronic signature is mainly used to indicate the intent of the document issuer or author, and it can take any form, such as an electronic symbol or process. It can be captured in as simple a way as a checkbox, since its primary purpose is to record the intention to sign a contract or document. These signatures are also legally binding. In cases where a document must be signed by two parties to be legally binding but does not require a high level of security and authorization, electronic signatures are used instead of digital signatures.

Key differences between digital signature and electronic signature

Let us understand the key differences between the two signatures by comparing the crucial parameters in a tabular form.

Parameter | Digital Signature | Electronic Signature
Purpose | Mainly to secure the document or contract through a PKI-based digital certificate | To verify the document or contract
Authorization | Yes. Digital signatures can be validated and verified by the certificate authorities that issue PKI certificates | No. It is usually not possible to authorize electronic signatures
Security | Stronger security features due to digital certificate based authorization | Fewer security features compared to a digital signature
Types of signs | In general, two types are available: one by Adobe and the other by Microsoft | Main types are verbal approvals, scanned physical signatures, and e-ticks
Verification | Yes. Digital signatures can be verified | No. Electronic signatures cannot be verified
Focus | Primary focus is to secure the document or contract | Primary focus is to show the intention of signing a document or contract
Benefits | Generally preferred over electronic signatures due to the high level of security | Easier to use than a digital signature, but less secure

As per the above comparison, it is clearly evident that digital signatures have the upper hand over electronic signatures. However, when the objective is simply a legally binding agreement, both types of signatures will serve the purpose. Digital signatures are now highly preferred due to their enhanced security through PKI-based certificates, which provide the much-needed authorization and integrity for the document.


What is Code Signing?

Code signing is the process of applying a digital signature to any software program that is intended for release and distribution to another party or user, with two key objectives. One is to prove the authenticity and ownership of the software. The second is to prove the integrity of the software i.e. prove that the software has not been tampered with, for example by the insertion of any malicious code. Code signing applies to any type of software: executables, archives, drivers, firmware, libraries, packages, patches, and updates. An introduction to code signing has been provided in earlier articles on this blog. In this article, we look at some of the business benefits of signing code.

Code signing is one type of PKI-based digital signature, used to validate the authenticity and originality of digital information such as a piece of software code. It assures users that this digital information is valid and establishes the legitimacy of the author. Code signing also ensures that this piece of digital information has not been changed or revoked after it was validly signed. Code signing plays an important role because it enables identification of legitimate software versus malware or rogue code. Digitally signed code ensures that the software running on computers and devices is trusted and unmodified.

Software powers your organization and reflects the true value of your business. Protecting that software with a robust code signing process is vital: it assures users that the code is not malicious and establishes the legitimacy of the author, without limiting access to the code.

Encryption Consulting's (EC) CodeSign Secure platform

Encryption Consulting's (EC) CodeSign Secure platform provides you with the facility to sign your software code and programs digitally. Hardware security modules (HSMs) store all the private keys used for code signing and the other digital signatures of your organization. Organizations leveraging the CodeSign Secure platform by EC can enjoy the following benefits:

  • Easy integration with leading Hardware Security Module (HSM) vendors
  • Platform access restricted to authorized users only
  • Key management service to avoid any unsafe storage of keys
  • Enhanced performance by eliminating signing bottlenecks


Why use EC's CodeSign Secure platform?

There are several benefits of using Encryption Consulting's CodeSign Secure for your code signing operations. CodeSign Secure helps customers stay ahead of the curve by providing a secure code signing solution with tamper-proof storage for the keys and complete visibility and control of code signing activities. The private keys of the code-signing certificate can be stored in an HSM to eliminate the risks associated with stolen, corrupted, or misused keys.

Client-side hashing ensures build performance and avoids unnecessary movement of files, providing a greater level of security. Seamless authentication is provided to code signing clients via the CodeSign Secure platform, along with state-of-the-art security features including client-side hashing, multi-factor authentication, device authentication, multi-tier approver workflows, and more. Support for InfoSec policies improves adoption of the solution and enables different business teams to have their own workflow for code signing. CodeSign Secure's client-side hash signing mechanism means less data travels over the network, making it a highly efficient code signing system even though the complex cryptographic operations occur in the HSM.
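
As an illustration of the client-side hashing idea, here is a minimal Python sketch in which only the SHA-256 digest of the build artifact leaves the build machine. The endpoint URL, request fields, and response format are hypothetical placeholders, not the actual CodeSign Secure API.

```python
# Illustrative sketch of client-side hashing: only the SHA-256 digest of the
# build artifact travels to the signing service, whose HSM-backed keys produce
# the signature. The URL and JSON fields below are hypothetical placeholders.
import hashlib
import requests

def sign_artifact(path: str, signing_url: str, api_token: str) -> bytes:
    digest = hashlib.sha256()
    with open(path, "rb") as artifact:
        for chunk in iter(lambda: artifact.read(1024 * 1024), b""):
            digest.update(chunk)  # hash locally; the file itself never leaves

    response = requests.post(
        signing_url,  # hypothetical endpoint, e.g. https://signing.example.com/api/v1/sign-hash
        json={"digest": digest.hexdigest(), "algorithm": "SHA-256"},
        headers={"Authorization": f"Bearer {api_token}"},
        timeout=30,
    )
    response.raise_for_status()
    return bytes.fromhex(response.json()["signature"])  # hypothetical response field

# Example usage:
# signature = sign_artifact("dist/app.exe", "https://signing.example.com/api/v1/sign-hash", token)
```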

Explore more about our CodeSign Secure platform's features and benefits at the link below:

CodeSigning Solution

Use cases covered as part of Encryption Consulting’s CodeSign Secure platform

There are multiple use cases that can be implemented using the CodeSign Secure platform by Encryption Consulting. The majority of these use cases relate to the digital signature concepts discussed above, and the CodeSign Secure platform caters to the all-round requirements of your organization. Let us look at some of the major use cases covered under Encryption Consulting's CodeSign Secure:

  • Code Signing:

    Sign code from any platform, including Apple, Microsoft, Linux, and much more.

  • Document Signing:

    Digitally sign documents using keys that are secured in your HSMs.

  • Docker Image Signing:

    Apply digital fingerprints to Docker images while storing the signing keys in HSMs.

  • Firmware Code Signing:

    Sign any type of firmware binary to authenticate the manufacturer and prevent firmware code tampering.

Organizations with sensitive data and patented code or programs can benefit from the CodeSign Secure platform. Online distribution of software is becoming the de facto standard today, considering the speed to market, reduced costs, scale, and efficiency advantages over traditional software distribution channels such as retail stores or software CDs shipped to customers.

Code signing is a must for online distribution. For example, third-party software publishing platforms increasingly require applications (both desktop and mobile) to be signed before agreeing to publish them. Even if you are able to reach a large number of users without code signing, the warnings shown during the download and installation of unsigned software are often enough to discourage users from proceeding.

Encryption Consulting provides strongly secured keys in FIPS-certified encrypted storage systems (HSMs) during the code signing operation. A faster code signing process can be achieved through CodeSign Secure, as the signing occurs locally on the build machine. Reporting and auditing features give InfoSec and compliance teams full visibility into all private key access and usage.

Get more information on CodeSign Secure in the datasheet link provided below:

Code-Signing-Datasheet.pdf

Which signature to use for your organization?

This depends solely on the purpose and intent of using the signature in your organization. You might need to perform a clear assessment, or approach expert consultants like us – Encryption Consulting – to understand which type of signature will suit your purpose better.

Encryption Consulting’s Managed PKI

Encryption Consulting LLC (EC) will completely offload the Public Key Infrastructure environment, which means EC will take care of building the PKI infrastructure to lead and manage the PKI environment (on-premises, PKI in the cloud, cloud-based hybrid PKI infrastructure) of your organization.

Encryption Consulting will deploy and support your PKI using a fully developed and tested set of procedures and audited processes. Admin rights to your Active Directory will not be required, and control over your PKI and its associated business processes will always remain with you. Furthermore, for security reasons the CA keys will be held in FIPS 140-2 Level 3 HSMs hosted either in your secure datacentre or in our Encryption Consulting datacentre in Dallas, Texas.

Conclusion

Encryption Consulting’s PKI-as-a-Service, or managed PKI, allows you to get all the benefits of a well-run PKI without the operational complexity and cost of operating the software and hardware required to run the show. Your teams still maintain the control they need over day-to-day operations while offloading back-end tasks to a trusted team of PKI experts.

New Major Ransomware Attack Strikes IT Solutions Provider, Kaseya

Another major ransomware supply chain attack occurred over the holiday weekend. On July 2nd, the IT solutions provider Kaseya issued a statement saying they had suffered a ransomware attack. This attack directly affected only 0.1% of Kaseya's customers, but those customers are Managed Service Providers (MSPs), which means hundreds of smaller businesses were also affected. This attack follows in the wake of several other large ransomware attacks in the past few months, including the Colonial Pipeline attack and the attack on the meat supplier JBS. Before we get into the specifics of this attack, let's first learn about who Kaseya is and what a ransomware attack is.

What is Kaseya?

Kaseya is an IT solutions provider that offers different software to Managed Service Providers and enterprises. These MSPs in turn offer their own services, such as Software as a Service, PKI as a Service, and similar offerings, to other smaller customers. This is one of the reasons the attack was so effective: each of these MSPs has several hundred small companies of their own that they inadvertently affected with this ransomware. An example of the software Kaseya provides is VSA, which is used to monitor and manage networks and endpoints.

What is ransomware?

Ransomware is a type of malware that encrypts all the files on a victim's system. Once the files are encrypted, the threat actors normally leave a ransom note telling the victim how much to pay and where to send the ransom, in return for which they send the decryption key back to the victim. It is recommended never to pay the ransom to a threat actor who has encrypted your data: they may not give you the decryption key, they may have exfiltrated the information anyway and blackmail you in the future, or they may not even know how to decrypt it.


What happened in this attack?

On July 2nd, 2021, Kaseya announced that an attack had hit their VSA tool and affected “a small number of on-premise customers.” Even though only a small number of customers were affected directly, that still translates into a significant number of victims: as mentioned earlier, many of the tools created by Kaseya are utilized by MSPs, and so their clients were affected as well. Kaseya recommended that victims shut off admin access to the hijacked tool, and it also pulled its SaaS servers and data centers offline.

The attack itself exploited a vulnerability within Kaseya’s VSA tool: the attackers used an authentication bypass vulnerability in the tool’s web interface to distribute their malware. This let the threat actors get around security controls, upload their payload, and use SQL injection to execute their code within the VSA tool. To do this, the attackers utilized a rogue certificate. Once the endpoint of the MSP or user was infected, the endpoint would write a file into its working directory. The machine would then run a number of PowerShell commands that stop and turn off a number of anti-malware services on the Windows computer. The file in the working directory is then turned into an executable file, thus releasing the ransomware.

However, to use the executable file, a legitimate signature was still needed, which is where the rogue certificate comes in. The certificate was found to belong to an organization called PB03transport, a legitimate organization. This indicates that the threat actors had access to that organization's private key, most likely obtained via phishing or a man-in-the-middle attack. Once the ransomware infected an MSP, the malware was passed on to the MSP's customers through an automated update containing the ransomware. The ransomware in question is called REvil ransomware and was uploaded to the VSA tool by its creators, the threat actors known as REvil or Sodinokibi. It is unknown at this time whether the victims have all paid the attackers.

Stopping this Type of Attack

The sad truth of this attack is that it could have been prevented. Utilizing a rogue certificate, these threat actors crippled thousands of companies, when proper certificate management could have stopped them. Had a managed certificate management system or PKI-as-a-Service, like the kind Encryption Consulting offers, been in place, this rogue certificate would not have been created in the first place. With proper certificate monitoring and key inventorying, the stolen key could have been detected and subsequently deactivated. Instead, many companies may have to pay a ransom just to get their data back.

Aligning to the NIST Cybersecurity Framework in Google Cloud

As defined by the U.S. Patriot Act of 2001, critical infrastructure includes “systems and assets, whether physical or virtual, so vital to the United States that the incapacity or destruction of such assets would have a debilitating impact on security, national economic security, national public health or safety, or any combination of those matters.”

Following Executive Order 13636, “Improving Critical Infrastructure Cybersecurity,” the Cybersecurity Enhancement Act of 2014 (CEA) identified the National Institute of Standards and Technology (NIST) as the leader in facilitating and supporting the development of cybersecurity risk frameworks. NIST went on to formalize the Cybersecurity Framework (CSF) – a consistent, iterative approach for identifying, assessing, and managing cybersecurity risk.

The NIST Cybersecurity Framework provides a standard mechanism for organizations to

  1. Describe their current cybersecurity posture.
  2. Describe their target state for cybersecurity.
  3. Identify and prioritize a continuous, repeatable process for reaching the target cybersecurity state.
  4. Assess progress toward the target state.
  5. Communicate cybersecurity risks to internal and external stakeholders

NIST Cybersecurity Framework: Functions

NIST generalizes cybersecurity activities into five core functions: Identify, Protect, Detect, Respond, and Recover. These functions help guide organizations in mapping out the management of cybersecurity risks. Organizations should perform these functions concurrently, continuously, and regularly to establish an operational culture for dynamically addressing cybersecurity risks.

Identify – Develop an organizational understanding to manage cybersecurity risk to systems, people, assets, data, and capabilities. Categories include Asset Management, Business Environment, Governance, Risk Assessment, and Risk Management Strategy.
Protect – Develop and implement appropriate safeguards to ensure the delivery of critical services. Categories include Identity Management & Access Control, Awareness & Training, Data Security, Information Protection Processes & Procedures, Maintenance, and Protective Technology.
Detect – Develop and implement appropriate activities to identify the occurrence of a cybersecurity event. Categories include Anomalies & Events, Security Continuous Monitoring, and Detection Processes.
Respond – Develop and implement appropriate activities to take action regarding a detected cybersecurity incident. Categories include Response Planning, Communications, Analysis, Mitigation, and Improvements.
Recover – Develop and implement appropriate activities to maintain resilience plans and restore any capabilities or services that were impaired due to a cybersecurity incident. Categories include Recovery Planning, Improvements, and Communications.


NIST Cybersecurity Framework: Categories

Each NIST CSF function spans multiple categories, which outline the components of the function. These categories cover the cybersecurity risk management areas that organizations should implement. When adopting new technology, including Google Cloud, organizations should leverage products and services that meet the requirements for each of the following categories:

IDENTIFY: Asset Management, Business Environment, Governance, Risk Assessment, Risk Management Strategy, Supply Chain Risk Management
PROTECT: Identity and Access Control, Awareness and Training, Data Security, Information Protection Processes & Procedures, Maintenance, Protective Technology
DETECT: Anomalies and Events, Security Continuous Monitoring, Detection Processes
RESPOND: Response Planning, Communications, Analysis, Mitigation, Improvements
RECOVER: Recovery Planning, Improvements, Communications

Subcategories

Further detailing cybersecurity implementation considerations, each category of the NIST CSF has subcategory items that define the risks that should be assessed for each topic. Selecting technologies and cloud service providers that can meet these subcategory needs is key to effectively leveraging the NIST CSF. Each subcategory, and the related Google Cloud products, methodologies, and services that can help meet its requirements, is outlined in the next section.

Implementing NIST CSF on Google Cloud

This section outlines each category and subcategory of the NIST Cybersecurity Framework. For each NIST CSF category and subcategory, recommendations on meeting and implementing these requirements in Google Cloud are mapped accordingly. Organizations can leverage some or all of the suggested components to define, enforce, and manage cloud security and compliance.

Identify

Asset Management

  • Physical devices and systems within the organization are inventoried
    • Cloud Identity
    • Google Admin Console
    • Cloud Resource Manager: Cloud Asset Inventory
    • Forseti Security: Asset Inventory
    • Cloud Security Command Center (CSCC)
  • Software platforms and applications within the organization are inventoried
    • Cloud Resource Manager: Cloud Asset Inventory
    • Forseti Security: Asset Inventory
    • Cloud Security Command Center (CSCC)
    • Cloud Data Catalog
    • Cloud Private Catalog
  • Organizational communication and data flows are mapped
    • Cloud Resource Manager
    • Cloud Identity & Access Management
  • External information systems are cataloged
    • Identity Platform
  • Resources (e.g., hardware, devices, data, time, personnel, and software) are prioritized based on their classification, criticality, and business value.
    • Cloud Resource Manager
    • Cloud Identity & Access Management
  • Cybersecurity roles and responsibilities for the entire workforce and third-party stakeholders (e.g., suppliers, customers, partners) are established
    • Cloud Identity & Access Management
    • Cloud Identity
    • Google Admin Console

Business Environment

  • The organization’s role in the supply chain is identified and communicated
    • Google Cloud Adoption Framework
    • Professional Services: Transformation Advisory
    • Professional Services: Change Management Advisory
  • The organization’s place in critical infrastructure and its industry sector is identified and communicated
    • Google Cloud Adoption Framework
    • Professional Services: Transformation Advisory
    • Professional Services: Change Management Advisory
  • Priorities for organizational mission, objectives, and activities are established and communicated
    • Google Cloud Adoption Framework
    • Professional Services: Transformation Advisory
    • Professional Services: Change Management Advisory
  • Dependencies and critical functions for the delivery of essential services are established
    • Google Cloud Services Overview

Governance

  • Organizational cybersecurity policy is established and communicated
    • Cloud Security Command Center (CSCC)
    • Forseti Security
    • Cloud Identity & Access Management
  • Cybersecurity roles and responsibilities are coordinated and aligned with internal roles and external partners
    • Cloud Identity & Access Management
    • Identity Platform
  • Legal and regulatory requirements regarding cybersecurity, including privacy and civil liberties obligations, are understood and managed
    • Google’s Security & Trust Center
  • Governance and risk management processes address cybersecurity risks
    • Professional Services: Cloud Discover Security
    • Policy Intelligence

Risk Assessment

  • Asset vulnerabilities are identified and documented
    • Cloud Security Scanner
    • Container Registry Vulnerability Scanner: Container Analysis
    • Cloud Armor
    • Phishing Protection
  • Cyber threat intelligence is received from information sharing forums and sources
    • Forseti Security
    • Cloud Security Command Center (CSCC)
  • Threats, both internal and external, are identified and documented
    • G Suite Security Center
    • Cloud Operations Suite
    • Cloud Security Command Center (CSCC)
  • Potential business impacts and likelihoods are identified
    • Cloud Security Command Center (CSCC)
    • G Suite Security Assessment
  • Threats, vulnerabilities, likelihoods, and impacts are used to determine risk
    • Forseti Security
    • Cloud Security Command Center (CSCC)

Risk Management Strategy

  • Risk management processes are established, managed, and agreed to by organizational stakeholders
    • Google Cloud Adoption Framework
    • Forseti Security
    • Cloud Security Command Center (CSCC)
  • The organization’s determination of risk tolerance is informed by its role in critical infrastructure and sector-specific risk analysis
    • Forseti Security
    • Cloud Security Command Center (CSCC)
    • G Suite Security Center
    • Policy Intelligence

Supply Chain Risk Management

  • Cyber supply chain risk management processes are identified, established, assessed, managed, and agreed to by organizational stakeholders
    • Must be implemented by the organization
  • Suppliers and third-party partners of information systems, components, and services are identified, prioritized, and assessed using a cyber supply chain risk assessment process
    • Identity Platform
  • Contracts with suppliers and third-party partners are used to implement appropriate measures designed to meet the objectives of an organization’s cybersecurity program and Cyber Supply Chain Risk Management Plan.
    • Must be implemented by the organization
  • Suppliers and third-party partners are routinely assessed using audits, test results, or other forms of evaluation to confirm they are meeting their contractual obligations.
    • Must be implemented by the organization
  • Response and recovery planning and testing are conducted with suppliers and third-party providers
    • Must be implemented by the organization

Protect

Identity Management, Authentication and Access Control

  • Identities and credentials are issued, managed, verified, revoked, and audited for authorized devices, users, and processes
    • Cloud Identity & Access Management
    • Cloud Identity
    • Google Admin Console
  • Physical access to assets is managed and protected
    • Cloud Identity & Access Management
    • VPC Service Controls
    • Cloud Identity Aware Proxy
    • Forseti Security
  • Remote access is managed
    • Cloud Identity Aware Proxy
    • Cloud VPN
    • Context-Aware Access
  • Access permissions and authorizations are managed, incorporating the principles of least privilege and separation of duties
    • Cloud Identity & Access Management
    • Identity Platform
  • Network integrity is protected (e.g., network segregation, network segmentation)
    • Cloud VPC
    • Cloud Resource Manager
  • Identities are proofed and bound to credentials and asserted in interactions
    • Cloud Identity
    • Google Admin Console
    • Identity Platform
  • Users, devices, and other assets are authenticated (e.g., single-factor, multifactor) commensurate with the risk of the transaction (e.g., individuals’ security and privacy risks and other organizational risks)
    • Cloud Identity & Access Management
    • Cloud Identity
    • Google Admin Console
    • Identity Platform

Awareness and Training

  • All users are informed and trained
    • Google Cloud Training
  • Privileged users understand their roles and responsibilities
    • Cloud Identity & Access Management
    • Cloud Identity
  • Third-party stakeholders (e.g., suppliers, customers, partners) understand their roles and responsibilities
    • Identity Platform
  • Senior executives understand their roles and responsibilities
    • Google Cloud Adoption Framework
    • Professional Services: Transformation Advisory
    • Professional Services: Change Management Advisory
  • Physical and cybersecurity personnel understand their roles and responsibilities
    • Cloud Identity & Access Management
    • Cloud Identity

Data Security

  • Data-at-rest is protected
    • Google Encryption at Rest
    • Cloud Key Management Service
    • Customer Supplied Encryption Keys (CSEKs)
    • Cloud HSM
  • Data-in-transit is protected
    • Google Encryption in Transit
  • Assets are formally managed throughout removal, transfers, and disposition
    • Cloud Resource Manager
    • Cloud Private Catalog
    • Cloud Data Catalog
  • Adequate capacity to ensure availability is maintained
    • GCP Quotas
    • Autoscaling
  • Protections against data leaks are implemented
    • Cloud Data Loss Prevention
    • Phishing Protection
    • Access Approval API
    • VPC Service Controls
  • Integrity checking mechanisms are used to verify software, firmware, and information integrity
    • Titan Security Key
    • Shielded VMs
    • reCAPTCHA Enterprise
    • Binary Authorization
  • The development and testing environment(s) are separate from the production environment
    • GKE Sandbox
    • Cloud Resource Manager
  • Integrity checking mechanisms are used to verify hardware integrity
    • Titan Security Key
    • Shielded VMs

Information Protection Processes and Procedures

  • A baseline configuration of information technology/industrial control systems is created and maintained incorporating security principles (e.g., the concept of least functionality)
    • Forseti Security
    • Cloud Security Command Center (CSCC)
    • Policy Intelligence
    • Cloud Deployment Manager
  • A System Development Life Cycle to manage systems is implemented
    • Cloud Deployment Manager
    • Binary Authorization
  • Configuration change control processes are in place
    • Access Approval API
    • Binary Authorization
  • Backups of information are conducted, maintained, and tested
    • Google Cloud Storage
  • Policy and regulations regarding the physical operating environment for organizational assets are met
    • Must be implemented by the organization
  • Data is destroyed according to policy
    • Google Cloud Data Deletion
  • Protection processes are improved
    • Policy Intelligence
    • Cloud Security Command Center (CSCC)
    • G Suite Security Assessment
  • The effectiveness of protection technologies is shared
    • Forseti Security
    • Cloud Security Command Center (CSCC)
  • Response plans (Incident Response and Business Continuity) and recovery plans (Incident Recovery and Disaster Recovery) are in place and managed
    • Incident Response Management
  • Response and recovery plans are tested
    • Incident Response Management
    • Google Cloud Disaster Recovery Planning Guide
  • Cybersecurity is included in human resources practices (e.g., deprovisioning, personnel screening)
    • Cloud Identity & Access Management
    • Cloud Operations Suite
  • A vulnerability management plan is developed and implemented
    • Forseti Security
    • Cloud Operations Suite
    • Cloud Security Command Center (CSCC)

Maintenance

  • Maintenance and repair of organizational assets are performed and logged, with approved and controlled tools
    • Cloud Identity & Access Management
    • Cloud Identity
    • Google Admin Console
    • Cloud Operations Suite
  • Remote maintenance of organizational assets is approved, logged, and performed in a manner that prevents unauthorized access
    • Identity Platform
    • Cloud Identity Aware Proxy
    • VPC Service Controls
    • Cloud VPC
    • Cloud Operations Suite

Protective Technology

  • Audit/log records are determined, documented, implemented, and reviewed per policy
    • Cloud Operations Suite
    • Forseti Security
    • Cloud Security Command Center (CSCC)
  • Removable media is protected, and its use restricted according to policy
    • Cloud Identity & Access Management
  • The principle of least functionality is incorporated by configuring systems to provide only essential capabilities
    • Cloud Identity & Access Management
  • Communications and control networks are protected
    • Cloud VPC
    • VPC Service Controls
    • Cloud VPN
    • Cloud Armor
  • Mechanisms (e.g., failsafe, load balancing, hot-swap) are implemented to achieve resilience requirements in normal and adverse situations
    • Global, Regional, Zonal Resources
    • Google Cloud Load Balancing
    • Cloud CDN
    • Autoscaling
    • Google Deployment Manager

Detect

Anomalies and Events

  • A baseline of network operations and expected data flows for users and systems is established and managed
    • Cloud VPC
    • Traffic Director
    • VPC Service Controls
  • Detected events are analyzed to understand attack targets and methods
    • Cloud Armor
    • G Suite Phishing & Malware Protection
    • Network Telemetry
    • Incident Response Management
    • Cloud Operations Suite
    • Cloud Security Scanner
    • Container Registry Vulnerability Scanner: Container Analysis
  • Event data is collected and correlated from multiple sources and sensors
    • Cloud Operations Suite
    • Cloud Security Command Center (CSCC)
    • G Suite Security Center
  • The impact of events is determined
    • Cloud Security Command Center (CSCC)
    • G Suite Security Center
  • Incident alert thresholds are established
    • Incident Response Management
    • Cloud Operations Suite

Security Continuous Monitoring

  • The network is monitored to detect potential cybersecurity events
    • Network Telemetry
    • Cloud Armor
    • VPC Service Controls
    • Traffic Director
  • The physical environment is monitored to detect potential cybersecurity events
    • Cloud Operations Suite
    • G Suite Security Center
    • Cloud Security Command Center (CSCC)
  • Personnel activity is monitored to detect potential cybersecurity events
    • Cloud Operations Suite
  • Malicious code is detected
    • Cloud Security Scanner
    • Container Registry Vulnerability Scanner: Container Analysis
  • Unauthorized mobile code is detected
    • Android Enterprise
    • Cloud Security Scanner
    • Container Registry Vulnerability Scanner: Container Analysis
  • External service provider activity is monitored to detect potential cybersecurity events
    • Cloud Operations Suite
    • Identity Platform
  • Monitoring for unauthorized personnel, connections, devices, and software is performed
    • Cloud Operations Suite
    • Cloud Security Command Center (CSCC)
    • Cloud Identity
    • Google Admin Console
    • Identity Platform
  • Vulnerability scans are performed
    • Cloud Armor
    • Container Registry Vulnerability Scanner: Container Analysis
    • Cloud Security Scanner

Detection Processes

  • Roles and responsibilities for detection are well defined to ensure accountability
    • Cloud Identity & Access Management
    • Cloud Identity
    • Google Admin Console
    • Identity Platform
  • Detection activities comply with all applicable requirements
    • Cloud Operations Suite
    • G Suite Security Center
    • Cloud Security Command Center (CSCC)
  • Detection processes are tested
    • Google’s Security & Trust Center
  • Event detection information is communicated
    • Event Threat Detection
    • Cloud Security Command Center (CSCC)
    • Cloud Pub/Sub
    • G Suite Security Center
    • Cloud Functions
  • Detection processes are continuously improved
    • Policy Intelligence
    • Cloud Security Command Center (CSCC)

Respond

Response Planning

  • A response plan is executed during or after an incident
    • Incident Response Management
    • G Suite Security Center
    • Cloud Security Command Center (CSCC)

Communications

  • Personnel know their roles and order of operations when a response is needed
    • Cloud Identity & Access Management
    • Cloud Identity
    • Google Admin Console
    • Identity Platform
  • Incidents are reported consistent with established criteria
    • Incident Response Management
    • Cloud Operations Suite
  • Information is shared consistently with response plans
    • Log Exports
  • Coordination with stakeholders occurs consistently with response plans
    • Incident Response Management
  • Voluntary information sharing occurs with external stakeholders to achieve broader cybersecurity situational awareness
    • Identity Platform
    • Incident Response Management
    • Cloud Identity & Access Management

Analysis

  • Notifications from detection systems are investigated
    • Cloud Security Command Center (CSCC)
    • G Suite Security Center
    • Cloud Operations Suite
  • The impact of the incident is understood
    • G Suite Security Center
    • Incident Response Management
    • Cloud Security Command Center (CSCC)
  • Forensics are performed
    • Cloud Security Command Center (CSCC)
    • Log Exports
    • BigQuery
  • Incidents are categorized consistently with response plans
    • Incident Response Management
  • Processes are established to receive, analyze and respond to vulnerabilities disclosed to the organization from internal and external sources (e.g., internal testing, security bulletins, or security researchers)
    • Cloud Security Command Center (CSCC)
    • G Suite Security Center
    • Event Threat Detection
    • Forseti Security

Mitigation

  • Incidents are contained
    • Incident Response Management
    • Event Threat Detection
  • Incidents are mitigated
    • Cloud Security Scanner
    • Cloud Armor
    • Container Registry Vulnerability Scanner: Container Analysis
    • Phishing Protection
  • Newly identified vulnerabilities are mitigated or documented as accepted risks
    • Cloud Security Command Center (CSCC)
    • G Suite Security Center
    • Cloud Security Scanner
    • Cloud Armor
    • Container Registry Vulnerability Scanner: Container Analysis
    • Phishing Protection

Improvements

  • Response plans incorporate lessons learned
    • Incident Response Management
    • Event Threat Detection
  • Response strategies are updated
    • Cloud Security Command Center (CSCC)
    • Forseti Security
    • G Suite Security Center

Recover

Recovery Planning

  • A recovery plan is executed during or after a cybersecurity incident
    • Google Cloud Disaster Recovery Planning Guide
    • Global, Regional, Zonal Resources
    • Google Cloud Load Balancing
    • Cloud CDN
    • Autoscaling
    • Google Deployment Manager
    • Incident Response Management

Improvements

  • Recovery plans incorporate lessons learned
    • Google Cloud Disaster Recovery Planning Guide
    • Global, Regional, Zonal Resources
    • Google Cloud Load Balancing
    • Cloud CDN
    • Autoscaling
    • Incident Response Management
    • Google Deployment Manager
  • Recovery strategies are updated
    • Google Cloud Disaster Recovery Planning Guide
    • Global, Regional, Zonal Resources
    • Incident Response Management
    • Google Deployment Manager

Communications

  • Public relations are managed
    • Contact Center AI
  • Reputation is repaired after an incident
    • Must be implemented by the organization
  • Recovery activities are communicated to internal and external stakeholders as well as executive and management teams
    • Incident Response Management
    • Contact Center AI
    • Google Cloud Status Dashboard

Conclusion

Having Google Cloud aligned with the NIST CSF enables customers to improve their cloud security posture with appropriate risk management and industry-compliant cloud services. Encryption Consulting, a leading cyber-security firm, offers various GCP and NIST-related cybersecurity consulting services catering to its customers. Encryption Consulting will conduct a risk and security control maturity assessment based on the outlined standards, helps customers get familiar with the NIST CSF and GCP security tools & documentation, and assists them in conducting a meaningful and quantifiable cybersecurity assessment while keeping the organization's business goals intact.

How to Automate PKI The Right Way?

In this discussion, let us understand: What is PKI? What are the components involved in Public Key Infrastructure (PKI)? Most importantly, how is the recent global pandemic forcing companies to adopt remote working, and how does this in turn pose a threat to firms' sensitive data? To secure that data, we need to understand how to scale the Public Key Infrastructure remotely in order to defend against various data breach attacks. Let's get into the topic:

What is Public Key Infrastructure – PKI?

PKI, or Public Key Infrastructure, is a cybersecurity technology framework which protects client-server communications. Certificates are used to authenticate the communication between client and server. PKI uses X.509 certificates and public keys to provide end-to-end encryption. In this way, both server and client can trust each other and verify authenticity, proving the integrity of the transaction. With the increase in digital transformation across the globe, it is highly critical to use Public Key Infrastructure to ensure safe and secure transactions. PKI has vast use cases across several sectors and industries, including healthcare and finance.
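
As a simple illustration of PKI protecting a client-server connection, the short Python sketch below opens a TLS session and lets the standard library validate the server's X.509 certificate chain against the trusted root CAs installed on the system.

```python
# Minimal sketch of PKI in a client-server connection: the TLS handshake fails
# unless the server presents an X.509 certificate that chains to a trusted CA
# and matches the requested hostname.
import socket
import ssl

context = ssl.create_default_context()  # loads the system's trusted root CAs

with socket.create_connection(("www.example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="www.example.com") as tls:
        cert = tls.getpeercert()
        print("TLS version:", tls.version())
        print("Server certificate subject:", cert.get("subject"))
        print("Issued by:", cert.get("issuer"))

# An untrusted, expired, or mismatched certificate raises
# ssl.SSLCertVerificationError before any application data is exchanged.
```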

What are important components in Public Key Infrastructure?

There are three key components: Digital Certificates, the Certificate Authority, and the Registration Authority. PKI protects the environment using these three critical components, which play a crucial role in protecting and securing digital communications and electronic transactions.

  • Digital Certificates:

    The most critical component in Public Key Infrastructure (PKI) is the digital certificate. These certificates are used to validate and identify the connections between server and client, so that the connections formed are secure and trusted. Certificates can be created individually depending on the scale of operations, or, for a large firm, PKI digital certificates can be purchased from trusted third-party issuers.

  • Certificate Authority:

    The Certificate Authority (CA) provides authentication and safeguards the trust placed in the certificates used. Whether for individual computer systems or servers, the Certificate Authority ensures that the digital identities of users are authenticated. Digital certificates issued by certificate authorities are trusted by devices.

  • Registration Authority:

    The Registration Authority (RA) is a component, approved by the Certificate Authority, that handles certificate requests from authenticated users. RA certificate requests range from an individual digital certificate for signing email messages to companies planning to set up their own private certificate authority. The RA sends all approved requests to the CA for certificate processing.

Why should firms automate their Public Key Infrastructure (PKI)?

Manually managing certificates and their lifecycle requires a lot of technical expertise and skill, and the certificate management process consumes a huge amount of time. On top of this, there is a high chance of human error creeping into the process, and a single mistake can prove very costly for your firm's cyber security, as it might lead to a data breach. To overcome the hurdles of finding experienced resources to manage the certificate lifecycle, cyber security experts have come up with the process of automating PKI. This not only saves time and money for the organization but also satisfies compliance and regulatory requirements.


What are the benefits of PKI automation?

As discussed before, firms are now looking toward automating their Public Key Infrastructure to improve how they manage their certificate lifecycles and to provide increased security for their highly sensitive data. At a high level, there are three benefits of shifting toward PKI automation.

  • All-inclusive Data Security:

    PKI automation helps drastically reduce the human errors that would otherwise increase the risk of a data breach. Automation helps manage the certificate lifecycle with precision: activities such as certificate renewal and/or replacement can be performed on time. PKI automation ensures that all machines requiring new certificate deployment or replacement are addressed immediately and accurately. This eliminates any risk of non-compliance due to outdated certificates in critical systems.

  • Operational Efficiency

    Operational efficiency is an important parameter for any organization's success. PKI automation saves the ample amount of time that goes into manually managing the certificate lifecycle, and certificate activities are handled more efficiently. Leveraging PKI automation also helps reduce the cost burden on firms. Considering all these factors, we can safely say that operational efficiency is enhanced through PKI automation.

  • Business Continuity Management

    If there is one important lesson we learnt from the recent global pandemic, it is handling unexpected outages due to known and unknown factors. A recent survey found that poor certificate management is a major cause of system outages. Manual certificate management is the main reason for unwanted certificate expiry and improper deployment of new certificates. A PKI automation process, which includes automated discovery of endpoint machines, new certificate deployment, and renewal or re-issuance of near-expiry certificates, eliminates the risk of such outages and in turn strengthens the business continuity management of the organization.

How to automate PKI?

There are several ways to automate Public Key Infrastructure (PKI), depending on the organization's requirements. You need to choose the appropriate implementation method to automate your PKI for enhanced efficiency. The method of implementation also depends on your Certificate Authority (CA) and whether it provides APIs for integration. Let us discuss, at a high level, four different ways to implement PKI automation.

  • REST API Integration.
  • Simple Certificate Enrollment Protocol (SCEP).
  • Enrollment over Secure Transport (EST).
  • Active Directory Auto-Enrollment. 

One of the most prominent and common ways of automating your PKI is API integration. If your Certificate Authority (CA) and the corresponding tools and software support API integration, you can leverage REST API integration. You can perform API integration from scratch, developing your own scripts that make API calls to the server to request a certificate and pass it on to the device, or you can leverage existing tools on the market that help perform the integration for automating PKI. Prominent software solutions such as Tanium, Casper, etc. provide integration support for automation.
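
As a rough illustration of REST API based enrollment, the sketch below generates a key pair, builds a PKCS#10 CSR with the Python cryptography library, and posts it to a CA endpoint. The URL, authentication header, and response format are hypothetical placeholders; the real interface depends entirely on your CA's API.

```python
# Hypothetical sketch of certificate enrollment over a CA's REST API: generate
# a key pair, build a PKCS#10 CSR, and POST it to the CA. The endpoint, auth
# header, and response format are placeholders for whatever your CA exposes.
import requests
from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
csr = (
    x509.CertificateSigningRequestBuilder()
    .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "device01.example.com")]))
    .sign(key, hashes.SHA256())
)

response = requests.post(
    "https://ca.example.com/api/v1/certificates",     # hypothetical endpoint
    headers={"Authorization": "Bearer <api-token>"},   # hypothetical auth scheme
    data=csr.public_bytes(serialization.Encoding.PEM),
    timeout=30,
)
response.raise_for_status()
issued_cert_pem = response.text  # hypothetical: the CA returns the signed certificate as PEM
```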

The second option is SCEP, the Simple Certificate Enrollment Protocol: an open certificate management protocol that automates the task of certificate issuance. SCEP is readily supported by the majority of operating systems, including Android, Microsoft Windows, Linux, iOS, and other major platforms. This option requires a SCEP agent on the device and works in conjunction with your enterprise device management tools: the management software sends a script down to the device, which then contacts the SCEP service to retrieve the certificate and configuration details. One major advantage is that the SCEP agent knows how to retrieve certificates onto the device.

To understand SCEP and its benefits in detail, please go through our dedicated article.

The third option available for implementing PKI automation is EST – Enrollment over Secure Transport. EST is an enhancement to SCEP and provides all the functionality we get from SCEP; an additional feature offered by EST is support for Elliptic Curve Cryptography (ECC). Both SCEP and EST are used to automate the certificate enrollment process, but the difference is that SCEP uses a shared-secret challenge and CSRs for enrolling certificates, whereas EST uses TLS for authentication. EST uses TLS to securely transport the messages and certificates, whereas SCEP uses PkcsPKIEnvelope envelopes to secure the messages.
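
For illustration, a minimal EST enrollment request might look like the sketch below, which follows the RFC 7030 conventions (the /.well-known/est/simpleenroll path, a base64-encoded PKCS#10 body, and a base64-encoded PKCS#7 response). The host name and credentials are placeholders, and many deployments authenticate with an existing TLS client certificate rather than HTTP Basic.

```python
# Hedged sketch of EST enrollment per RFC 7030: the CSR is sent over TLS to the
# well-known simpleenroll endpoint as base64-encoded PKCS#10, and the CA replies
# with a base64-encoded, certs-only PKCS#7 containing the issued certificate.
# Host and credentials are placeholders.
import base64
import requests
from cryptography.hazmat.primitives import serialization

def est_simple_enroll(csr, est_host: str, username: str, password: str) -> bytes:
    csr_der = csr.public_bytes(serialization.Encoding.DER)
    response = requests.post(
        f"https://{est_host}/.well-known/est/simpleenroll",
        data=base64.b64encode(csr_der),
        headers={"Content-Type": "application/pkcs10"},
        auth=(username, password),
        timeout=30,
    )
    response.raise_for_status()
    return base64.b64decode(response.content)  # DER-encoded PKCS#7 with the issued certificate

# Example usage (reusing the CSR built in the previous sketch):
# pkcs7_der = est_simple_enroll(csr, "est.example.com", "enroll-user", "enroll-pass")
```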

The last option in our discussion of automating certificate management is Microsoft Active Directory (AD) auto-enrollment. Windows PCs and servers can utilize this option through the Microsoft certificate store; services such as Internet Information Services (IIS) and Exchange Server use the Microsoft certificate store for auto-enrollment. As you can understand, this option is only applicable to Windows machines that use Microsoft services.

Finally, which option to choose for implementing PKI automation is completely dependent on the organization's IT infrastructure. Consulting firms like us come into play at this step, helping you select and implement PKI automation with less effort, lower overheads, and more efficiency.


Encryption Consulting’s Managed PKI

Encryption Consulting LLC (EC) will completely offload the Public Key Infrastructure environment, which means EC will take care of building the PKI infrastructure to lead and manage the PKI environment (on-premises, PKI in the cloud, cloud-based hybrid PKI infrastructure) of your organization.

Encryption Consulting will deploy and support your PKI using a fully developed and tested set of procedures and audited processes. Admin rights to your Active Directory will not be required, and control over your PKI and its associated business processes will always remain with you. Furthermore, for security reasons the CA keys will be held in FIPS 140-2 Level 3 HSMs hosted either in your secure datacentre or in our Encryption Consulting datacentre in Dallas, Texas.

Conclusion

Encryption Consulting’s PKI-as-a-Service, or managed PKI, allows you to get all the benefits of a well-run PKI without the operational complexity and cost of operating the software and hardware required to run the show. Your teams still maintain the control they need over day-to-day operations while offloading back-end tasks to a trusted team of PKI experts.

Aligning to the NIST Cybersecurity Framework in the AWS Cloud

Let's define the NIST Cyber Security Framework in brief.

The NIST Cyber Security Framework, known as the NIST CSF, is a cybersecurity assessment framework developed by NIST (the National Institute of Standards and Technology). The core purpose of the NIST CSF is to protect the nation's critical infrastructure using a set of cybersecurity best practices and recommendations. It is a voluntary, risk-based, and outcome-oriented cybersecurity framework that helps your organization categorize its security activities around five key functions: 1) Identify, 2) Protect, 3) Detect, 4) Respond, and 5) Recover.

 Let’s look at each function briefly:

Identify – The Identify function assists you in evolving an overall cybersecurity risk management approach to systems, people, assets, data, and capabilities in the organization. It helps you identify critical assets, the overall business environment, the governance model, and the supply chain.

Protect – The Protect function helps you set up defensive controls based on inputs from the Identify function, such as critical assets and risk tolerance/acceptance levels. It also emphasizes the importance of access control and identity management, protecting data, and training and awareness for users.

Detect – The Detect function helps you detect anomalies, malicious activities, and other events effectively through continuous security monitoring and other detection processes and procedures.

Respond – Complementing the Detect function, Respond helps you take the right action immediately through incident response planning, mitigation actions for events, accurate analysis, communication to the designated stakeholders, and continuous improvement with each event.

Recover – The Recover function assists you in getting back to the pre-attack condition with the help of recovery planning, continuous improvement, and communication to the designated stakeholders.

NIST Cyber Security Framework Overview: Core, Tiers, and Profile

The NIST CSF consists of three sections:

The Core section represents cybersecurity practices; technical, operational, and process security controls; and outcomes that support the five risk management functions: Identify, Protect, Detect, Respond, and Recover.

The Tiers section emphasizes the organization's processes for managing risk while remaining aligned with the NIST CSF.

The Profiles characterize how effectively an organization's cybersecurity program is managing its risk. They also express the state of the organization's "as is" and "to be" cybersecurity postures.



NIST Cyber Security Framework and AWS Cloud

Earlier, the AWS team published a guide on how to implement the NIST CSF in an AWS cloud environment. AWS recommends using the NIST CSF as a mechanism to put baseline security in place that can improve an organization's cloud security objectives. The NIST CSF contains a comprehensive controls catalogue derived from ISO/IEC 27001 (1), NIST SP 800-53 (2), COBIT (3), ANSI/ISA-62443 (4), and the Top 20 Critical Security Controls (CSC) (5).

There is a listing on the AWS portal that specifies the alignment of the NIST CSF to various AWS services, known as the "AWS Services and Customer Responsibility Matrix for Alignment to the CSF" (6). This is a comprehensive list that customers can use to align their needs with the CSF in the AWS cloud for their security requirements. It also enables customers to design their baseline security requirements to meet their security goals.

AWS Cloud Adoption Framework

Before setting up a baseline, it is important for a customer to have a clear understanding of their business use cases and of the customer-owned responsibilities for “security in the AWS cloud”. The customer should review the “AWS Cloud Adoption Framework” (7) to evaluate the governance model that will be required when implementing the NIST CSF on AWS cloud services. The AWS CAF (Cloud Adoption Framework) lists pointers known as “CAF Perspectives” to identify gaps in security skills, capabilities, and cybersecurity processes.

NIST CSF Functions and Responsibilities (Customer-owned & AWS-owned)

AWS has organized the NIST CSF functions, categories, and subcategories into 108 outcome-based security activities. Each function distinguishes customer-owned and AWS-owned responsibilities: AWS is responsible for security of the cloud, while the customer is responsible for security in the cloud. Business owners and stakeholders can use the “AWS Services and Customer Responsibility Matrix for Alignment to the CSF” to tailor these activities to the organization’s tier and profile levels in the CSF.

In this mapping, the CSF core functions (Identify, Protect, Detect, Respond, and Recover) and their defined categories have been converted into 108 outcome-based security activities (8) by AWS.

So far, we have discussed how the NIST CSF aligns with AWS Cloud Services and how customers can use the CAF (Cloud Adoption Framework) Perspectives to evaluate gaps in skills, capabilities, and cybersecurity processes.

Now let’s discuss how appropriate AWS services can be leveraged to set up an effective security architecture using the NIST Cyber Security Framework.

The following list summarizes AWS Cloud Services categorized by NIST CSF Core Function, based on the nature of each service:

  • Identify: Organizations, Security Hub, Config, Trusted Advisor, Systems Manager, Control Tower
  • Protect: Shield, Certificate Manager, KMS, Network Firewall, WAF, Firewall Manager, CloudHSM, IAM, Direct Connect, VPC, Single Sign-On
  • Detect: GuardDuty, Macie, Inspector, Security Hub
  • Respond: CloudWatch, Lambda, Detective, CloudTrail, Systems Manager, Step Functions
  • Recover: OpsWorks, CloudFormation, S3 Glacier, Snapshot, Archive, CloudEndure Disaster Recovery
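As an illustration of how one of these services supports its function, the minimal sketch below (assuming the boto3 SDK is installed and AWS credentials are configured) enables an Amazon GuardDuty detector, one of the services listed under the Detect function:

```python
# Minimal sketch: enable Amazon GuardDuty, a Detect-function service, in the
# current AWS account and region. Assumes boto3 is installed and credentials
# are configured; this is an illustration, not a full Detect implementation.
import boto3

guardduty = boto3.client("guardduty")

# Create a detector so GuardDuty starts continuously monitoring for threats.
response = guardduty.create_detector(Enable=True)
print("GuardDuty detector created:", response["DetectorId"])
```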

Conclusion

Aligning AWS Cloud Services with the NIST CSF enables customers to improve their cloud security posture with appropriate risk management and industry-compliant cloud services. Encryption Consulting, a leading cybersecurity firm, offers AWS- and NIST-related cybersecurity consulting services, providing customers with a risk and security control maturity assessment based on the standards outlined above. Encryption Consulting familiarizes customers with the NIST CSF and with AWS security tools and documentation, and assists them in conducting a meaningful, quantifiable cybersecurity assessment while keeping the organization’s business goals intact.

Resources
  1. ISO/IEC 27001:2013, Information Technology – Security techniques – Information Security management systems – Requirements. ISO. Retrieved February 18, 2021, from: https://www.iso.org/standard/54534.html
  2. NIST Special Publication (SP) 800-53, Rev. 5, Security and Privacy Controls for Information Systems and Organizations. National Institute for Standards and Technology. Retrieved February 18, 2021, from: https://csrc.nist.gov/publications/detail/sp/800-53/rev-5/final
  3. Control Objectives for Information and Related Technology (COBIT), an ISACA Framework. Information Systems Audit and Control Association (ISACA). Retrieved February 18, 2021 from: https://www.isaca.org/resources/cobit
  4. ANSI/ISA-62443-2-4-2018 / IEC 62443-2-4:2015+AMD1:2017 CSV, Security for industrial automation and control systems. International Society of Automation (ISA).
  5. The 20 CIS Controls & Resources. Center for Internet Security (CIS). Retrieved February 18, 2021, from: https://www.cisecurity.org/controls/cis-controls-list/
  6. AWS Services and Customer Responsibility Matrix for Alignment to the CSF can be downloaded from here: https://aws.amazon.com/compliance/nist/
  7. An overview of the AWS Cloud Adoption Framework (CAF), Ver. 2. Amazon Web Services, Inc.
  8. An overview of AWS capabilities that can be leveraged with NIST CSF: https://d1.awsstatic.com/whitepapers/compliance/NIST_Cybersecurity_Framework_CSF.pdf

Everything You Need To Know About Diffie-Hellman Key Exchange Vs. RSA

What is Diffie-Hellman (DH) Key Exchange?

Diffie-Hellman (DH), also known as exponential key exchange, was published in 1976. It is a key exchange protocol that allows a sender and receiver communicating over a public channel to establish a mutual secret without the secret itself ever being transmitted over the internet. DH securely generates a unique session key for encryption and decryption and has the additional property of forward secrecy.

In short, the trick is to use a mathematical function that’s easy to calculate in one direction but very difficult to reverse, even when some of the aspects of the exchange are known.

As a typical example with Alice and Bob:

  • Let’s say Alice and Bob agree on a random common color, “yellow,” to start with.
  • Alice and Bob each pick a private color and do not let the other party know what they chose. Let’s assume Alice picks “red” and Bob picks “aqua.”
  • Next, Alice and Bob each combine their secret color (Alice: red; Bob: aqua) with the common color, “yellow.”
  • Once they have combined the colors, they send the result to the other party. After the exchange, Alice receives “sky blue” (Bob’s combination) and Bob receives “orange” (Alice’s combination).
  • Each then adds their own secret color to the result they received: Alice adds red to the sky blue, and Bob adds aqua to the orange.
  • As a result, they both end up with the same color, “brown.”

The crucial part of the DH key exchange is that both parties end up with the same color without the final shared secret ever being sent across the communication channel. An attacker listening to the exchange sees only the common color and the two mixtures, and it is challenging to work backwards from those to the secret colors or to the final mixed color (brown).
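In the actual protocol, the “colors” are large numbers and the “mixing” is modular exponentiation. The minimal sketch below uses a deliberately tiny, stand-in prime and generator purely to show the arithmetic; real deployments use standardized groups of 2048 bits or more:

```python
# Minimal Diffie-Hellman sketch with toy parameters (illustration only).
# The modulus p and generator g here are stand-ins; real systems use
# standardized 2048-bit (or larger) groups.
import secrets

p = 0xFFFFFFFB  # small prime modulus (far too small for real use)
g = 5           # generator

# Each party picks a private exponent (the "secret color").
a = secrets.randbelow(p - 2) + 1   # Alice's private value
b = secrets.randbelow(p - 2) + 1   # Bob's private value

# Each party "mixes" its secret with the public parameters and sends the result.
A = pow(g, a, p)   # Alice sends A to Bob
B = pow(g, b, p)   # Bob sends B to Alice

# Each party mixes the value it received with its own secret.
alice_shared = pow(B, a, p)
bob_shared = pow(A, b, p)

assert alice_shared == bob_shared   # both arrive at the same shared secret
```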

Is the Diffie-Hellman key exchange used in modern cryptography?

Yes, Diffie-Hellman is used in modern cryptography. It is a standard way of establishing a session key over a public channel. Because the algorithm has high processor overhead, it is not used for bulk or stream encryption but rather to create the initial session key for starting the encrypted session. Afterward, under the protection of this session key, other cryptographic protocols negotiate and trade keys for the remainder of the encrypted session. Think of DH as an expensive method of passing that initial secret; more efficient and specialized cryptographic algorithms then protect the confidentiality of the remainder of the session.

Uses of Diffie-Hellman

DH is one of the most popular key exchange protocols. It is used across a range of software and hardware:

  • While using DH key exchange, the sender and receiver have no prior knowledge of each other.
  • Communication can take place through an insecure channel.
  • Public Key Infrastructure (PKI)
  • Secure Socket Layer (SSL)
  • Transport Layer Security (TLS)
  • Secure Shell (SSH)
  • Internet protocol security (IPsec)

Limitations of Diffie-Hellman

  • Does not authenticate either party involved in the exchange.
  • It cannot be used for asymmetric encryption.
  • It cannot be used to encrypt messages.
  • It cannot be used to create digital signatures.


What is RSA Algorithm?

The RSA algorithm is used to perform public-key cryptography. In RSA, the sender (Bob) encrypts the data to be transferred using the receiver’s (Alice’s) public key, and the receiver (Alice) decrypts the encrypted data using her private key.

A typical example: how does public-key cryptography work?

Public-key cryptography uses two keys: one key to encrypt the data and the other key to decrypt it. The owner of the key pair (here, Alice, the receiver) keeps the private key secret and shares the public key with anyone who wants to send her data. The steps below show how public-key cryptography works, followed by a small numeric sketch.

Public Key Cryptography
  • Bob uses Alice’s public key to encrypt the message and sends it to Alice.
  • Alice will use her private key to decrypt the message and get the plain text.
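Here is a minimal, textbook-RSA sketch with deliberately tiny numbers (purely illustrative; real RSA uses keys of 2048 bits or more and padding schemes such as OAEP):

```python
# Textbook RSA with toy numbers (illustration only).

# Key generation, normally done once by Alice.
p, q = 61, 53                # two small primes (toy values)
n = p * q                    # modulus, shared by both keys
phi = (p - 1) * (q - 1)      # Euler's totient of n
e = 17                       # public exponent, coprime with phi
d = pow(e, -1, phi)          # private exponent: modular inverse of e (Python 3.8+)

# Bob encrypts a small message with Alice's public key (e, n).
message = 42
ciphertext = pow(message, e, n)

# Alice decrypts with her private key (d, n) and recovers the message.
plaintext = pow(ciphertext, d, n)
assert plaintext == message
```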

Uses of RSA

RSA is widely used for cryptography in network environments, and it supports software and hardware as mentioned below:

  • Assures confidentiality, integrity, and authentication of electronic communication.
  • Secure electronic communication.
  • RSA is used in security protocols such as IPsec, TLS/SSL, SSH.
  • Used for creating digital signatures.
  • High-speed and straightforward encryption.
  • Easy to implement and understand.
  • It prevents the third party from intercepting messages.

Limitations of RSA

  • Prolonged key generation.
  • Vulnerable when it comes to Key exchange if poorly implemented.
  • Slow signing and decryption process.
  • RSA doesn’t provide perfect forward secrecy.

Diffie-Hellman Key Exchange Vs. RSA

Asymmetric-key (public-key) cryptography is far superior to symmetric-key cryptography where the security of confidential data is concerned, and it includes many cryptographic algorithms. Both the Diffie-Hellman key exchange and RSA have advantages and disadvantages, and both algorithms can be modified for better performance. RSA can be combined with ECC to improve security and performance, and DH can be integrated with digital and public key certificates to prevent attacks such as man-in-the-middle attacks.

A parameter-by-parameter comparison of RSA and the Diffie-Hellman (DH) key exchange:

  • Algorithm type: RSA is a public-key encryption algorithm; DH is also a public-key algorithm, used for key agreement.
  • Purpose: RSA is suitable for commercial purposes such as online shopping; DH is likewise suitable for commercial purposes.
  • Authentication: RSA assures confidentiality, integrity, and authentication of electronic communication; DH does not authenticate either party involved in the exchange.
  • Key strength: RSA at 1024 bits is less robust than Diffie-Hellman at 1024 bits.
  • Attacks: RSA is susceptible to low-exponent, common-modulus, and cycle attacks; DH is susceptible to man-in-the-middle attacks.
  • Forward secrecy: RSA doesn’t provide perfect forward secrecy; the DH key exchange does.

Conclusion

While the Diffie-Hellman key exchange may seem complex, it is fundamental to securely exchanging data online. As long as it is implemented alongside an appropriate authentication method and the numbers have been appropriately selected, it is not considered vulnerable to attack. The DH key exchange was an innovative method for helping two unknown parties communicate safely when it was developed in 1976. While we now implement newer versions with larger keys to protect against modern technology, the protocol itself looks like it will continue to be secure until the arrival of quantum computing and the advanced attacks that will come with it.

RSA doesn’t provide perfect forward secrecy, which is another disadvantage compared to the ephemeral Diffie-Hellman key exchange. Collectively, these reasons are why, in many situations, it’s best only to apply RSA in conjunction with the Diffie-Hellman key exchange.

Alternatively, the DH key exchange can be combined with an algorithm like the Digital Signature Standard (DSS) to provide authentication, key exchange, confidentiality, and check the integrity of the data. In such a situation, RSA is not necessary for securing the connection.

The security of both DH and RSA depends on how they are implemented, so it isn’t easy to conclude that one is superior to the other. You will usually prefer RSA over DH, or vice versa, based on interoperability constraints and the context.


Your Guide To Scaling PKI Remotely

In this discussion whiteboard, let us understand: What is PKI? What are the components involved in a Public Key Infrastructure (PKI)? Most importantly, how is the recent global pandemic forcing companies to adopt remote working, and how is this in turn threatening firms’ sensitive data? To secure that data, we need to understand how to scale Public Key Infrastructure remotely in order to defend against data breach attacks. Let’s get into the topic:

Are cyber security practices such as Public Key Infrastructure still relevant during the COVID-19 pandemic era?

To answer this question, consider the findings of a survey conducted by PwC on the financial measures CFOs are considering during the COVID-19 global pandemic to reduce business impact and continue sustainability. An interesting reveal from this survey is that, of all the CFOs who responded, 67% are considering cancelling or deferring planned investments to reduce the financial burden on their firms.

Of that 67%, only 2% are considering cutting planned cyber security activities, while the rest are unwilling to reduce the budget for data protection. This clearly indicates the importance of cyber security, especially encryption and PKI, during a pandemic in which data is spread across locations because many employees are working remotely.

What made cyber security, especially Public Key Infrastructure (PKI), critical during COVID-19?

It is a well-known fact that cyber security was critical to any firm with sensitive data even before the COVID-19 pandemic hit the globe. During the COVID-19 crisis, it became even more critical, with employees who handle sensitive data working remotely all over the world. This complicates the process of tracking down sensitive data (at rest, in transit, and in use) and protecting it.

So, handling Public Key Infrastructure (PKI) remotely became critical for revoking short-lived certificates and managing the existing, live certificates. Managing PKI remotely is also highly critical for compliance purposes, as companies may face huge penalties for non-compliance with several international standards. PKI can be leveraged for email protection, VPN, user authentication, and website certificate management. PKI has become a business-critical asset in the cyber security domain during the COVID-19 global pandemic.

What is PKI?

PKI, or Public Key Infrastructure, is a cyber security framework that protects client-server communications. Certificates are used to authenticate communication between client and server. PKI also uses X.509 certificates and public keys to provide end-to-end encryption. In this way, both server and client can trust each other and check each other’s authenticity, proving the integrity of the transaction. With the increase in digital transformation across the globe, it is highly critical to use Public Key Infrastructure to ensure safe and secure transactions. PKI has vast use cases across several sectors and industries, including the medical and finance fields.


What are important components in Public Key Infrastructure?

There are three key components: Digital Certificates, Certificate Authority, and Registration Authority. PKI can protect the environment using these three critical components, which play a crucial role in protecting and securing digital communications and electronic transactions.

  • Digital Certificates: The most critical component in a Public Key Infrastructure (PKI) is digital certificates. These certificates are used to validate and identify the connections between server and client, so the connections formed are secure and trusted. Certificates can be created individually depending on the scale of operations; if the requirement is for a large firm, PKI digital certificates can be purchased from trusted third-party issuers.
  • Certificate Authority: The Certificate Authority (CA) provides authentication and safeguards trust for the certificates used by users. Whether for individual computer systems or servers, the Certificate Authority ensures that the digital identities of the users are authenticated. Digital certificates issued through certificate authorities are trusted by devices.
  • Registration Authority: The Registration Authority (RA) is a component approved by the Certificate Authority to issue certificates for authenticated requests from users. RA certificate requests range from individual digital certificates used to sign email messages to companies planning to set up their own private certificate authority. The RA sends all approved requests to the CA for certificate processing.

That should give you a good answer to the question of how a PKI works. Now let’s learn why you should scale your PKI remotely.


Why should firms worry about scaling PKI remotely?

COVID-19 has not only created a health crisis across the globe; it has also created havoc in cyberspace, amounting to a cyber pandemic as well. There has been a multi-fold increase in the number of cyber-attacks since the start of the COVID-19 pandemic, with cyber-criminals exploiting employees’ remote working arrangements and newly deployed remote access solutions. Numbers suggest that during the initial days of the global pandemic, the volume of cyber-attacks increased by 33%. Recent attacks on one of the largest gas pipelines and a major meat supplier show that even firms with huge infrastructures are no exception.

Why use PKI?

Traditional cyber security mechanisms, such as multi-factor authentication and password-based protection, are commonly implemented to secure sensitive data remotely, but these techniques are no longer foolproof: cyber criminals can manipulate them and breach otherwise secured environments. Because these techniques can be breached, many cyber security research organizations suggest moving away from them. Leveraging Public Key Infrastructure to implement certificate-based authentication provides stronger security for sensitive data than these traditional approaches.

How can you leverage Public Key Infrastructure (PKI) remotely?

Public Key Infrastructure (PKI) can provide better and stronger security than the password-based protection or multi-factor authentication often used to protect sensitive data. As several research firms, such as Forrester and Gartner, say, it is preferable to adopt a “Zero Trust Security Model” to reduce the risk of exposing your business and employees, and PKI can be one of the most important layers in achieving a “Zero Trust” strategy. There are three critical steps your organization can follow to scale Public Key Infrastructure remotely and protect data spread across different locations:

  1. PKI certificate-based authentication can be used to replace traditional password-based protection.
  2. PKI certificate authentication can be used to replace traditional multi-factor authentication.
  3. Automation of identity certificate management can also be implemented.

PKI Certificate based authentication vs Password based protection

As per the Verizon 2019 Data Breach Investigations Report, 62% of breaches are caused by phishing, stolen credentials, or brute force. From this research data, we can deduce that the majority of data breaches involve password leakage, whether deliberate or accidental, or hacking techniques such as brute-force attacks, which makes password-based protection more vulnerable.

On the other hand, PKI-based user identity certificates used in certificate-based authentication can be considered one of the strongest forms of identity authentication. This also eases the process for employees, as they are not required to remember and update passwords frequently. In certificate-based authentication, digital certificates are used for user authentication.

Reasons why PKI based authentication is better:

  • The private key used for authentication always resides in the client environment.
  • Because the private key never leaves the client, it cannot be stolen in transit or at rest in server repositories.
  • Unlike passwords, the keys behind digital certificates would take many years to break using brute-force attacks.
  • There is no requirement to remember or frequently change digital certificates as there is with passwords.
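As a small client-side illustration of certificate-based authentication, the sketch below (with hypothetical file names and endpoint) uses the Python requests library to present a client certificate and private key during the TLS handshake instead of submitting a password:

```python
# Minimal sketch of certificate-based (mutual TLS) client authentication.
# The endpoint URL and file names are hypothetical placeholders.
import requests

response = requests.get(
    "https://internal.example.com/api/status",        # hypothetical endpoint
    cert=("employee-cert.pem", "employee-key.pem"),   # client certificate and private key
    verify=True,                                      # validate the server's certificate chain
)
print(response.status_code)
```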

PKI certificate authentication vs Traditional multi factor authentication

It is a known fact that multi factor authentication, either via hardware token device or mobile SMS/call-based authentication, will provide additional security when compared to only using password-based protection. Unfortunately, this is a cumbersome process for employees as there are extra steps involved in going through the authentication cycle. PKI certificate-based authentication will help in eliminating this extra step and still be able to provide stronger data security.

Advantages of using PKI certificate authentication over traditional multi factor authentication are:

  • Employees need not worry about carrying and securing extra hardware tokens or devices for additional security.
  • Extra step of entering secure token ID or One time password (OTP) can be avoided.
  • Connected devices can be trusted and authenticated.
  • Using PKI certificate authentication, you can achieve several use cases for multiple entities such as users, machines and devices (mobile).
  • Using PKI, you can satisfy multiple use cases such as user authentication, machine authentication, windows logon, accessing corporate emails, VPN access to name a few.

Automation of identity certificate management

The final step in scaling PKI remotely is to automate certificate management. Automation reduces the burden on IT staff by streamlining the labour-intensive processes of certificate deployment, renewal, and revocation, and it allows certificates to be replaced or revoked quickly. One small building block of such automation, an expiry check, is sketched after the list below.

Benefits of automating certificate lifecycle:

  • Certificate discovery: Performing discovery activity to identify certificates in use across the business landscape.
  • Certificate Deployment: Automated issuance of certificates and installation.
  • Certificate Review: Automatically renew the certificates wherever necessary and revoke them if expired.
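Here is that expiry-check sketch (standard library only, assuming the host is directly reachable over TLS): it connects to a server, reads its certificate, and reports how many days remain before it expires. A scheduler or monitoring tool could run this across an inventory of endpoints:

```python
# Minimal sketch: report how many days remain on a server's TLS certificate.
# Assumes the host is reachable on port 443 from where the script runs.
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(host: str, port: int = 443) -> int:
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # 'notAfter' looks like 'Jun  1 12:00:00 2026 GMT'
    expires = datetime.fromtimestamp(
        ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc
    )
    return (expires - datetime.now(timezone.utc)).days

print(days_until_expiry("www.encryptionconsulting.com"))
```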

Encryption Consulting’s Managed PKI

Encryption Consulting LLC (EC) will completely offload your Public Key Infrastructure environment, which means EC will take care of building the PKI infrastructure and of leading and managing the PKI environment (on-premises PKI, PKI in the cloud, or cloud-based hybrid PKI infrastructure) for your organization.

Encryption Consulting will deploy and support your PKI using a fully developed and tested set of procedures and audited processes. Admin rights to your Active Directory will not be required, and control over your PKI and its associated business processes will always remain with you. Furthermore, for security reasons the CA keys will be held in FIPS 140-2 Level 3 HSMs hosted either in your secure datacentre or in our Encryption Consulting datacentre in Dallas, Texas.

Is Your PKI Healthy?

Six Key Factors that decide PKI Health

In this discussion, we will try to answer the following questions: What is PKI? What are the components involved in a Public Key Infrastructure (PKI)? Most importantly, what key factors can be leveraged to perform a PKI health check? Health checks with appropriate deciding factors are critical for ensuring the health of a Public Key Infrastructure. Let’s dive into the topic:

What is Public Key Infrastructure – PKI?

PKI, or Public Key Infrastructure, is a cyber security framework that protects client-server communications. Certificates are used to authenticate communication between client and server. PKI also uses X.509 certificates and public keys to provide end-to-end encryption. In this way, both server and client can trust each other and check each other’s authenticity, proving the integrity of the transaction. With the increase in digital transformation across the globe, it is highly critical to use Public Key Infrastructure to ensure safe and secure transactions. PKI has vast use cases across several sectors and industries, including medical and finance.

What are important components in a Public Key Infrastructure?

There are three key components to a PKI: Digital Certificates, Certificate Authority, and Registration Authority. PKIs can protect the environment using these three critical components. These components play a crucial role in protecting and securing digital communications and electronic transactions.

  • Digital Certificates: The most critical component in a Public Key Infrastructure (PKI) is Digital certificates. These certificates are used to validate and identify the connections between server and client. This way, the connections formed are very secure and can be trusted. Certificates can be created individually depending on the scale of operations. If the requirement is for a large firm, PKI digital certificates can be purchased from trusted third-party issuers. These are the reasons why PKI certificate management is so vital.
  • Certificate Authority: A Certificate Authority (CA) provides authentication and safeguards the certificates used by the users. Whether it is individual computer systems or servers, the Certificate Authority ensures the digital identities of the users are authenticated. Digital certificates issued through certificate authorities are trusted by devices.
  • Registration Authority: The Registration Authority (RA) is a component approved by the Certificate Authority to issue certificates for authenticated user requests. RA certificate requests range from individual digital certificates used to sign email messages to companies planning to set up their own private Certificate Authority. The RA sends all the approved requests to the CA for certificate processing.

Why should firms worry about PKI Health?

A Public Key Infrastructure is not a one-time setup and forget activity. Regular health monitoring is as important as the initial implementation of your PKI, as it plays a crucial and deciding role in the firm’s cyber security. PKI Health monitoring and checking activity will ensure that a steady state of operations is achieved. The majority of certificate policies state that an audit has to be performed on a regular basis to safeguard the compliance of the Certificate Authorities (CAs). It is highly advisable to perform a complete check once a year, at least. 

Public Key Infrastructure health checks involve multiple steps and factors. Out of all these, some of the important processes that are included in a standard PKI health check are:

  • Patch management and backup.
  • Certificate checks: Issuing and revoking of certificates.
  • Auditing of the Certificate Authority

Six Key Factors / Indicators that decide PKI Health:

Public key infrastructures, or PKIs, have been around for a considerable amount of time. Most businesses are well aware of their benefits and capabilities by now. However, due to ever-growing cyber threats, it’s important to continually check for major signs of vulnerabilities. It’s important to do an analysis and fix the areas where a fix is needed. This early analysis is a proactive and vigilant measure that significantly reduces the risk of any cyber-attack. If it is passed over without giving it due attention, it may result in a loss of customer data or regulatory penalties.

To ensure PKI health, we have noted six factors that should be observed during the early stages.

  • Certificate Validity – All digital certificates have expiry dates. For security reasons, it is unwise to reuse the same digital certificate for a long duration of time without any oversight. If the expiry date is not documented and tracked in an orderly fashion, then the chance of a breach increases. An expired digital certificate provides no security at all.
    A good practice is to document the certificate lifecycle of each digital certificate in use and keep it updated. Along with this, ensuring you have strong PKI certificate management processes in place is also extremely important. This process can also be automated with the help of various certificate management tools. Early notification can also be configured, that way all stakeholders are notified before the issuance or renewal of a certificate.
  • Certificate Integrity – To convince customers or potential customers to transact or share information over the Internet, trust must first be established. One way to do that technically is by using a proper Certificate Authority. These are entities that verify the authenticity of a web-based service or product. Certificate Authorities prevent phishing attempts since they verify SSL/TLS certificates. Digital certificates verified by a known Certificate Authority are considered safe, and many modern browsers and tools help identify that.
    All stakeholders who are associated with the issuance of digital certificates should ensure that the certificates have all the parameters which are required to get them verified by the CA. The process should also be documented, with a clear depiction of association and accountability. This helps in various situations such as seamless renewal of an expired certificate, replacing a corrupted certificate, tracking a compromised certificate in a security event, and taking required preventative actions.
  • Certificate Issuance Policy – A Certificate Authority can impose certain restrictions during the issuance of a certificate. These restrictions can vary, such as restricting or force-allowing X.509 values, restricting the allowed subject fields, or restricting the allowed issuance modes.
    If these are kept in check, then backtracking and troubleshooting become much easier. The stakeholders should be responsible for tracking all issuance policies associated with each certificate.
  • Certificate Endpoints – SSL certificates can be added to endpoints, and sometimes one digital certificate can be associated with multiple endpoints. In such cases, it is important to track every endpoint so that each one is updated in scenarios such as when the certificate expires and is renewed. These endpoints can become vulnerable and exposed if not tracked appropriately.
  • Encryption Key Size – The size of the encryption key is directly related to key strength. Keys such as RSA 4096 provide high security and assurance because their larger size makes brute-force attacks very difficult.
    It is important to check for any key in use that has a small bit size and is therefore weaker. For better PKI health, all existing weak keys should be replaced with stronger ones.
  • Encryption Algorithm – A healthy PKI should always use a strong and robust hashing algorithm. Algorithms keep getting updated over time to become stronger and swifter; for example, SHA-256 is much more secure than, say, SHA-1. Stakeholders should keep track of which algorithm is being used and whether it is the industry standard. When a stable update is available, it is better to replace the outdated algorithm. A sketch showing how both the key size and the hash algorithm of a certificate can be checked follows this list.
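The sketch below (assuming the third-party cryptography package is installed and a PEM-encoded certificate saved as server.pem, a hypothetical file name) reads a certificate and reports its public key size and signature hash algorithm, so weak keys and outdated algorithms can be flagged:

```python
# Minimal sketch: inspect a certificate's key size and signature hash algorithm.
# Assumes the 'cryptography' package is installed and 'server.pem' is a
# PEM-encoded certificate (hypothetical file name).
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

with open("server.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

public_key = cert.public_key()
if isinstance(public_key, rsa.RSAPublicKey):
    print("RSA key size:", public_key.key_size)        # flag anything below 2048 bits
elif isinstance(public_key, ec.EllipticCurvePublicKey):
    print("EC curve:", public_key.curve.name)

print("Signature hash:", cert.signature_hash_algorithm.name)  # e.g. 'sha256'
```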


How will PKI health checks benefit firms?

  • Performing regular PKI health checks will ensure a strong overall cyber security posture for the organization.
  • Operational effectiveness will be monitored regularly through routine PKI health checks.
  • Compliance with regulatory standards and frameworks will be ensured, as there are periodic checks on certificate health.
  • Threat vectors of data loss will be reduced considerably, along with overall risk.
  • High availability of critical processes will ensure smooth running of the business.

Encryption Consulting’s Managed PKI

Encryption Consulting LLC (EC) will completely offload your Public Key Infrastructure environment, which means EC will take care of building the PKI infrastructure and of leading and managing the PKI environment (on-premises PKI, PKI in the cloud, or cloud-based hybrid PKI infrastructure) for your organization.

Encryption Consulting will deploy and support your PKI using a fully developed and tested set of procedures and audited processes. Admin rights to your Active Directory will not be required, and control over your PKI and its associated business processes will always remain with you. Furthermore, for security reasons the CA keys will be held in FIPS 140-2 Level 3 HSMs hosted either in your secure datacentre or in our Encryption Consulting datacentre in Dallas, Texas.

Why You Need To Know About HSTS and SSL Stripping Attacks

Users and organizations are valuing security and privacy more and more every day. Technologies such as HTTPS, a combination of the HTTP and SSL/TLS protocols, were created to provide confidentiality and integrity for web browsing. Many organizations have taken measures to promote the deployment of HTTPS. Since 2014, Google has improved the rankings of websites that deploy HTTPS. Furthermore, in Chrome, websites that do not deploy HTTPS cannot use geographic location or the application cache, and users may not even be able to visit them; eventually, such sites are flagged with an unsafe symbol in the Chrome address bar. In the past, obtaining and maintaining digital certificates was costly, so small companies, or large companies with many domain names, might not deploy HTTPS because of the expense. More recently, Let’s Encrypt, with its ACME protocol, has made it easier and cheaper to obtain SSL/TLS certificates, providing Domain Validation (DV) certificates for free through a fully automated process. Apart from Let’s Encrypt, several content distribution networks and cloud service providers, including Cloudflare and Amazon, provide free TLS certificates to their customers.

However, many plain HTTP connections still exist on the Internet, and handling the mix of HTTP and HTTPS connections seamlessly is difficult for browsers because of the stripping attack. HTTPS stripping attacks have raised widespread concern since Marlinspike presented sslstrip at the Black Hat conference in 2009. Attackers can intercept the communication between the target website and the client and change all HTTPS links into HTTP in the response packets from the website. Even though this attack violates the rule that TLS/SSL should ensure end-to-end security, neither the client nor the server can detect it, because the packets sent from the server are still encrypted.

To defend against the stripping attack, the HTTP Strict Transport Security (HSTS) protocol was introduced in 2012. It defines a mechanism enabling websites to declare themselves accessible only via secure connections.

Stripping attack

A few years ago, HTTPS was deployed only on financial or e-commerce payment pages or login pages. However, more and more sites began to deploy HTTPS. One reason is that many studies show site owners should provide HTTPS on all site pages, including all resource files, because encrypting only part of a site has proven unsafe. Another reason is the emergence of free certificates and TLS accelerators. Maintaining HTTPS service used to be very expensive, covering the cost of applying for certificates, the cost of renewing them, and the performance overhead caused by the extra encryption and decryption. Fortunately, these problems have been solved in recent years: many organizations began to provide free TLS/SSL certificates, and websites have greatly benefited from HTTPS.

Nonetheless, the HTTPS stripping attack poses a risk to HTTPS. When users type a domain name without a protocol type (HTTP or HTTPS), the default request type is HTTP rather than HTTPS. Usually, if the server provides HTTPS service, it will respond to an HTTP request with a 302 redirection. However, an attacker can intercept the traffic through ARP spoofing and replace all HTTPS with HTTP in the response packet, so the browser will still request an HTTP website regardless of the 302 redirection. The attacker, in turn, replaces all HTTP with HTTPS in the request packet. The communication between the attacker and the server is encrypted, but the communication between the attacker and the browser is in plaintext. This attack is called an HTTPS stripping attack, and neither browsers nor servers can detect it, as it follows the HTTP communication protocol.

HSTS protocol

To avoid the HTTPS stripping attack, the HTTP Strict Transport Security (HSTS) policy was created in 2012. Websites declare the policy via the Strict-Transport-Security HTTP response header field or other means, such as user-agent configuration. If the server wants to provide HTTPS service all the time, it sends an HSTS header to the browser. Based on the information in this header, the browser remembers the domains that must only be visited over HTTPS, and the next time the user sends an HTTP request, the browser automatically converts it to HTTPS in the background. The HSTS policy defines the standard for HSTS headers, which mainly consist of three fields. The first is the max-age field, which specifies the expiration time and is mandatory. The second is the optional includeSubDomains field, which indicates whether the HSTS policy applies to the domain’s subdomains. The last one is the preload field, which is also optional and indicates whether the domain has been permanently added to the preload list maintained by browser providers. Importantly, these headers are only honored when received over HTTPS; hence an attacker cannot arbitrarily tamper with the HSTS policy to disable it.
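To see a site’s HSTS policy in practice, the small sketch below (Python standard library only, with an example hostname) fetches a page over HTTPS and prints the Strict-Transport-Security header if the server sends one:

```python
# Minimal sketch: print a site's Strict-Transport-Security header, if any.
# Standard library only; the hostname is just an example.
from urllib.request import urlopen

with urlopen("https://www.encryptionconsulting.com/") as response:
    hsts = response.headers.get("Strict-Transport-Security")

# A typical value looks like: max-age=31536000; includeSubDomains; preload
print(hsts or "No HSTS header returned")
```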


Conclusion

The HSTS protocol provides a way to avoid SSL stripping attacks, but many site owners and developers do not understand the HSTS policy well. An SSL stripping attack gives attackers a way to read a user’s traffic to a website in plaintext: the data between the browser and the attacker is not encrypted, exposing everything, including passwords, credit card information, and more. Using the HSTS protocol avoids these issues and forces the browser to use encryption between itself and the server.

Secure Your Infrastructure With Certificates Using AWS Certificate Manager

A digital certificate is a crucial component in securing infrastructure and ensuring the authenticity of users, applications, devices, servers, and more. A digital certificate provides a way to perform authentication, authorize services, and execute tasks such as initiating an HTTPS connection, establishing an encrypted connection using asymmetric encryption, and checking whether a user, website, or device is authentic. Digital certificates replace the username/password combination used for authentication and also introduce more functionality. For example, if two parties intend to initiate a secure connection using public keys, the public key is attached to the digital certificate. This greatly reduces the chances of man-in-the-middle attacks and keeps the connection secure.

But managing these digital certificates needs proper infrastructure. Digital certificates are issued by Certificate Authorities (CAs). If a publicly trusted CA issues the digital certificate, then all browsers will automatically trust the certificate after checking that it is valid. If the CA is not trusted, or the certificate is self-signed (that is, not issued by a CA), then we either need to explicitly trust the certificate or accept a warning in the browser.

AWS Certificate Manager (ACM) provides a way to create, store and renew public and private SSL/TLS X.509 certificates, including the public and private keys. These certificates can be used to secure websites and applications hosted on AWS.

Amazon Web Services Certificate Manager (ACM)

AWS Certificate Manager is a service by Amazon that lets a user provision, manage, and deploy public and private SSL/TLS certificates for use with AWS services and internal connected resources. SSL/TLS certificates are used to establish secure network connections and to prove the identity of a website or of resources on a private network. ACM handles purchasing, managing, and renewing SSL/TLS certificates and deploying them into the infrastructure, directly saving time and improving manageability.
AWS offers two options to customers deploying managed X.509 certificates. Organizations can choose the best one for their needs.

  • AWS Certificate Manager (ACM)
  • ACM Private CA

ACM Private CA

ACM Private CA is a service for enterprise customers building a public key infrastructure (PKI) inside the AWS cloud and intended for private use within an organization. With ACM Private CA, users can create their certificate authority (CA) hierarchy and issue certificates to authenticate users, computers, applications, services, servers, and other devices. Certificates issued by a private CA cannot be used on the internet.

ACM Certificate

AWS Certificate Manager generates X.509 version 3 certificates. Each certificate is valid for 13 months and contains the following extensions:

  • Basic Constraints- specifies whether the subject of the certificate is a certification authority (CA).
  • Authority Key Identifier- enables identification of the public key corresponding to the private key used to sign the certificate.
  • Subject Key Identifier- enables identification of certificates that contain a particular public key.
  • Key Usage- defines the purpose of the public key embedded in the certificate.
  • Extended Key Usage- specifies one or more purposes for which the public key may be used in addition to the purposes identified by the Key Usage extension.
  • CRL Distribution Points- specifies where CRL information can be obtained.

ACM Root CAs

Each root CA, identified by its distinguished name, and its encryption algorithm:

  • CN=Amazon Root CA 1, O=Amazon, C=US: 2048-bit RSA (RSA_2048)
  • CN=Amazon Root CA 2, O=Amazon, C=US: 4096-bit RSA (RSA_4096)
  • CN=Amazon Root CA 3, O=Amazon, C=US: Elliptic Prime Curve 256 bit (EC_prime256v1)
  • CN=Amazon Root CA 4, O=Amazon, C=US: Elliptic Prime Curve 384 bit (EC_secp384r1)

The default root of trust for ACM-issued certificates is CN=Amazon Root CA 1, O=Amazon, C=US, which offers 2048-bit RSA security. The other roots are reserved for future use. All of the roots are cross-signed by the Starfield Services Root Certificate Authority certificate.

ACM Certificate characteristics

Certificates provided by ACM have specific characteristics applied to them. If the certificate is imported into the ACM, the characteristics might not apply. The characteristics in public certificates are:

  • Domain Validation: ACM certificates are domain validated; that is, the subject of an ACM certificate is a domain name. When an ACM certificate is requested, the organization must validate that it owns, controls, and manages all the domains specified in the request. Users can validate domain ownership by email or via DNS.
  • Validity Period: The validity period for ACM certificates is 13 months or 395 days.
  • Managed Renewal and Deployment: ACM manages the process of renewing ACM certificates and provisioning the certificates after they are renewed. Automatic renewal can help organizations avoid downtime due to incorrectly configured, revoked, or expired certificates.
  • Browser and application trust: 
    ACM certificates are trusted by all major browsers, including Google Chrome, Microsoft Internet Explorer and Microsoft Edge, Mozilla Firefox, and Apple Safari. Browsers that trust ACM certificates display a lock icon in their status bar or address bar when connected by SSL/TLS to sites that use ACM certificates. Java also trusts ACM certificates.
  • Multiple domain names: Each ACM certificate must include at least one fully qualified domain name (FQDN), and users can add additional names if they want. For example, when users create an ACM certificate for www.encryptionconsulting.com, they can also add the name www.encryptionconsulting.net if they can reach their site using either name. This is also true of bare domains (also known as the zone apex or naked domains). That is, users can request an ACM certificate for www.encryptionconsulting.com and add the name encryptionconsulting.com.
  • Wildcard domain names: ACM allows users to use an asterisk (*) in the domain name to create an ACM certificate containing a wildcard name that can protect several sites in the same domain. For example, *.encryptionconsulting.com covers www.encryptionconsulting.com and images.encryptionconsulting.com.
  • Algorithms: A certificate must specify an algorithm and key size. Currently, the following public-key algorithms are supported by ACM:
    • 2048-bit RSA (RSA_2048)
    • 4096-bit RSA (RSA_4096)
    • Elliptic Prime Curve 256 bit (EC_prime256v1)
    • Elliptic Prime Curve 384 bit (EC_secp384r1)

Disadvantages of using ACM Certificate

  • ACM does not provide extended validation (EV) certificates or organization validation (OV) certificates.
  • ACM does not provide certificates for anything other than the SSL/TLS protocols.
  • Organizations cannot use ACM certificates for email encryption.
  • ACM allows only UTF-8 encoded ASCII for domain names, including labels that contain “xn--” (Punycode). ACM does not accept Unicode input (u-labels) for domain names.
  • ACM does not currently permit users to opt-out of managed certificate renewal for ACM certificates. Also, managed renewal is not available for certificates that organizations import into ACM.
  • Users cannot request certificates for Amazon-owned domain names such as those ending in amazonaws.com, cloudfront.net, or elasticbeanstalk.com.
  • Users cannot download the private key for an ACM certificate.
  • Users cannot directly install ACM certificates on their Amazon Elastic Compute Cloud (Amazon EC2) website or application. Users can, however, use their certificate with any integrated service.


Services integrated with AWS Certificate Manager

AWS Certificate Manager supports a growing number of AWS services. Organizations cannot install their ACM certificate or their private ACM Private CA certificate directly on their AWS-based website or application; instead, certificates are deployed through the integrated services described below.

  • Elastic Load Balancing: Elastic Load Balancing automatically distributes the organization’s incoming application traffic across multiple Amazon EC2 instances. It detects unhealthy instances and reroutes traffic to healthy instances until the unhealthy instances have been restored. Elastic Load Balancing automatically scales its request handling capacity in response to incoming traffic.
    In general, to serve secure content over SSL/TLS, load balancers require that SSL/TLS certificates be installed on either the load balancer or the back-end Amazon EC2 instance. ACM is integrated with Elastic Load Balancing to deploy ACM certificates on the load balancer.
  • Amazon CloudFront: Amazon CloudFront is a web service that speeds up the distribution of an organization’s dynamic and static web content to end-users by delivering their content from a worldwide network of edge locations. When an end-user requests content that they are serving through CloudFront, the user is routed to the edge location that provides the lowest latency. This ensures that content is delivered with the best possible performance. If the content is currently at that edge location, CloudFront delivers it immediately. If the content is not currently at that edge location, CloudFront retrieves it from the Amazon S3 bucket or web server that users have identified as the definitive content source.
    To serve secure content over SSL/TLS, CloudFront requires that SSL/TLS certificates be installed on either the CloudFront distribution or on the backed content source. ACM is integrated with CloudFront to deploy ACM certificates on the CloudFront distribution.
  • AWS Elastic Beanstalk: Elastic Beanstalk helps users deploy and manage applications in the AWS Cloud without worrying about the infrastructure that runs those applications. AWS Elastic Beanstalk reduces management complexity. Users upload their applications, and Elastic Beanstalk automatically handles the details of capacity provisioning, load balancing, scaling, and health monitoring. Elastic Beanstalk uses the Elastic Load Balancing service to create a load balancer.
    Users must configure the load balancer for their application in the Elastic Beanstalk console and choose the certificate to use.
  • Amazon API Gateway: With the proliferation of mobile devices and the Internet of Things (IoT), it has become increasingly common to create APIs that can be used to access data and interact with back-end systems on AWS. Users can use API Gateway to publish, maintain, monitor, and secure their APIs. After the user deploys their API to API Gateway, users can set up a custom domain name to simplify access. To set up a custom domain name, users must provide an SSL/TLS certificate. They can use ACM to generate or import the certificate.
  • AWS Nitro Enclaves: AWS Nitro Enclaves is an Amazon EC2 feature that allows users to create isolated execution environments, called enclaves, from Amazon EC2 instances. Enclaves are separate, hardened, and highly constrained virtual machines. They provide only secure local socket connectivity with their parent instance. They have no persistent storage, interactive access, or external networking. Users cannot SSH into an enclave. The data and applications inside the enclave cannot be accessed by the parent instance’s processes, applications, or users (including root or admin).
  • AWS CloudFormation: AWS CloudFormation helps users’ model and set up their Amazon Web Services resources. Users create a template that describes the AWS resources they want to use, such as Elastic Load Balancing or API Gateway. Then AWS CloudFormation takes care of provisioning and configuring those resources for them. Users don’t need to individually create and configure AWS resources and figure out what’s dependent on what; AWS CloudFormation handles all of that. ACM certificates are included as a template resource, which means that AWS CloudFormation can request ACM certificates that users can use with AWS services to secure connections.
    With the powerful automation provided by AWS CloudFormation, it is easy to exceed their certificate quota, especially with new AWS accounts.

Data Protection in AWS Certificate Manager

The AWS shared responsibility model applies to data protection in AWS Certificate Manager. As described in this model, AWS is responsible for protecting the global infrastructure that runs all of the AWS Cloud. Organizations are responsible for maintaining control over their content that is hosted on this infrastructure. This content includes the security configuration and management tasks for the AWS services that organizations use.

We recommend that organizations protect AWS account credentials and set up individual user accounts with AWS Identity and Access Management (IAM) for data protection purposes. That way, each user is given only the permissions necessary to fulfill their job duties. We also recommend that organizations secure their data in the following ways:

  • Use multi-factor authentication (MFA) with each account.
  • Use SSL/TLS to communicate with AWS resources. We recommend TLS 1.2 or later.
  • Set up API and user activity logging with AWS CloudTrail.
  • Use AWS encryption solutions, along with all default security controls within AWS services.
  • Use advanced managed security services such as Amazon Macie, which assists in discovering and securing personal data that is stored in Amazon S3.
  • If users require FIPS 140-2 validated cryptographic modules when accessing AWS through a command-line interface or an API, use a FIPS endpoint.

We strongly recommend that users never put sensitive identifying information, such as their customers’ account numbers, into free-form fields such as a Name field. This includes when users work with ACM or other AWS services using the console, API, AWS CLI, or AWS SDKs. Any data that they enter into ACM or other services might get picked up for inclusion in diagnostic logs. When they provide a URL to an external server, don’t include credentials information in the URL to validate their request to that server.


ACM Private Key security

When users request a public certificate, AWS Certificate Manager (ACM) generates a public/private key pair. For imported certificates, users generate the key pair. The public key becomes part of the certificate. ACM stores the certificate and its corresponding private key and uses AWS Key Management Service (AWS KMS) to help protect the private key. The process works like this:

  1. The first time users request or import a certificate in an AWS Region, ACM creates an AWS managed customer master key (CMK) in AWS KMS with the alias aws/acm. This CMK is unique in each AWS account and each AWS Region.
  2. ACM uses this CMK to encrypt the certificate’s private key. ACM stores only an encrypted version of the private key; ACM does not store the private key in plaintext. ACM uses the same CMK to encrypt the private keys for all certificates in a specific AWS account and a specific AWS Region.
  3. When users associate the certificate with a service integrated with AWS Certificate Manager, ACM sends the certificate and the encrypted private key to the service. A grant is also created in AWS KMS, allowing the service to use the CMK in AWS KMS to decrypt the certificate’s private key.
  4. Integrated services use the CMK in AWS KMS to decrypt the private key. Then the service uses the certificate and the decrypted (plaintext) private key to establish secure communication channels (SSL/ TLS sessions) with its clients.
  5. When the certificate is disassociated from an integrated service, the grant created in step 3 is retired. This means the service can no longer use the CMK in AWS KMS to decrypt the certificate’s private key.

Request a public certificate using the console

To request an ACM public certificate (console):

  1. Sign in to the AWS Management Console and open the ACM console, then choose Request a certificate.
  2. On the Request a certificate page, choose Request a public certificate and then Request a certificate to continue.
  3. On the Add domain names page, type the domain name. Users can use a fully qualified domain name (FQDN), such as www.encryptionconsulting.com, or a bare or apex domain name such as encryptionconsulting.com. Users can also use an asterisk (*) as a wildcard in the leftmost position to protect several site names in the same domain. For example, *.encryptionconsulting.com protects corp.encryptionconsulting.com and images.encryptionconsulting.com. The wildcard name will appear in the Subject field and the Subject Alternative Name extension of the ACM certificate.
  4. To add another name, choose Add another name to this certificate and type the name in the text box. This is useful for protecting both a bare or apex domain (such as encryptionconsulting.com) and its subdomains (such as *.encryptionconsulting.com).
  5. On the Select validation method page, choose either DNS validation or Email validation, depending on their needs.
    Before ACM issues a certificate, it validates that users own or control the domain names in their certificate request. Users can use either email validation or DNS validation. If they choose email validation, ACM sends validation emails to three contact addresses registered in the WHOIS database and five common system administration addresses for each domain name. Users or an authorized representative must reply to one of these email messages.
  6. On the Add Tags page, users can optionally tag their certificate. Tags are key/value pairs that serve as metadata for identifying and organizing AWS resources.
    When users finish adding tags, choose Review.
  7. If the Review page contains correct information about their request, choose Confirm and request. A confirmation page shows that their request is being processed and that certificate domains are being validated. Certificates awaiting validation are in the Pending validation state.
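The same request can also be made programmatically. The sketch below (assuming boto3 is installed, AWS credentials and a region are configured, and using encryptionconsulting.com purely as an example domain) requests a DNS-validated public certificate and then prints the CNAME record that ACM asks you to create to prove domain ownership:

```python
# Minimal sketch: request a DNS-validated ACM public certificate with boto3.
# Assumes boto3 is installed and AWS credentials/region are configured;
# the domain names are examples only.
import boto3

acm = boto3.client("acm")

response = acm.request_certificate(
    DomainName="encryptionconsulting.com",
    SubjectAlternativeNames=["*.encryptionconsulting.com"],
    ValidationMethod="DNS",
)
arn = response["CertificateArn"]

# ACM returns a CNAME record to add to the domain's DNS zone to complete
# validation (it may take a moment to appear in the describe output).
details = acm.describe_certificate(CertificateArn=arn)
for option in details["Certificate"]["DomainValidationOptions"]:
    record = option.get("ResourceRecord")
    if record:
        print(record["Name"], record["Type"], record["Value"])
```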

AWS Services by Encryption Consulting

Encryption Consulting provides AWS Data Protection Services, where we provide our expertise on scalability, cost-effectiveness, and ease of implementation. Amazon Web Services (AWS) is a leading cloud service provider with a wide range of services. It is estimated that 41.5% of total cloud users are consumers of AWS Cloud Services. Amazon has over 1 million users in 190 countries. One-third of internet users are estimated to visit a website using Amazon Web Services. With such a vast customer base and services, there is an imminent threat of data breach and loss.

Organizations utilizing AWS web services and applications are responsible for securing their sensitive and critical data stored in the cloud. Amazon Web Services (AWS) provides easy deployment and management of its IT operations; however, a challenge is that mistakes can happen and cascade to a more significant impact.

For instance, the misconfiguration of a data store can expose sensitive information such as personally identifiable information (PII), payment card industry (PCI) data, or protected health information (PHI).

In a recent breach event, a reputed marketing analytics company did not configure appropriate security controls on an Amazon Simple Storage Service (Amazon S3) bucket within their AWS environment. As a result of this misconfiguration, data related to 123 million households was leaked, including sensitive data such as home addresses, occupations, and mortgage information.

Encryption Consulting LLC will help your organization with its expertise in Cloud platforms and security services in deploying data protection controls in your AWS Cloud environment. Learn more about our services here. Also, you can read about a case study we did on Data Protection Service here.

Conclusion

AWS Certificate Manager provides a way to manage SSL/TLS certificates easily and integrate those certificates into the AWS environment to keep devices, websites, and infrastructure secure. Although ordinary SSL/TLS certificates are used, ACM has a few advantages and disadvantages that make it different from how SSL/TLS certificates have typically been managed. At Encryption Consulting, we provide detailed assessments and solutions that help organizations build secure and scalable infrastructure while maintaining efficiency and keeping costs minimal.
