Artifact Signing: Dual, Post-Quantum/Traditional Hybrid Signatures and Downgrades
April 2025
Authors: Panos Kampanakis (AWS), Daniel Van Geest (CryptoNext Security)
Summary: This document discusses how new digital signatures (e.g. quantum-resistant) can be introduced in common artifact/software/firmware signing technologies. It addresses topics related to downgrades when using multiple signatures for backwards compatibility. It also covers how existing artifact signing standards can support hybrid solutions with multiple cryptographic families used together to ensure attacks against one do not render the whole signed artifact insecure. It concludes that existing standards can enable such uses, but software implementations may still need updates that allow signature verifiers to support dual signing, prevent downgrades and allow legacy verifiers to remain operational.
Introduction
Various standards have been defined to introduce digital signatures to digital assets for different use-cases. These include the CMS, UEFI, COSE and OpenTitan specifications. Digital signatures ensure the integrity and authenticity of document, software, or firmware artifacts.
What happens if there is an issue with the security of a signing algorithm used for code signing? This question previously came up during the transition away from signatures that used SHA-1 as the digest. The collision resistance of SHA-1 no longer held, which meant that signatures had to start using SHA-256 hashing. Legacy verifiers that did not support SHA-256 still had to work, but new verifiers had to use SHA-256. To achieve this, signers started adding two signatures to artifacts and verifiers used “OR logic” when verifying the signature. A legacy verifier that did not understand the new SHA-256 signature still validated the SHA-1 one, whereas an upgraded verifier verified the SHA-256 signature. Such dual signatures allowed for a migration to SHA-256 while legacy verifiers remained operational with SHA-1.
There is a caveat with dual signatures, and it has to do with downgrades. There are different types of downgrade risks against dual-signed artifacts, with varying levels of severity:
- Signature Stripping on Unmodified Content: In this scenario, a signature is simply removed from the artifact without modifying the signed content. An adversary does not gain much advantage by doing that, however this action could still undermine trust in the message by falsely suggesting that a signer never approved it. Verifiers frequently consider unsigned messages as insecure and fail verification.
- Signature-Hash Second Pre-image Attacks: A second-preimage attack is when an attacker can forge a message that verifies with an existing signature generated from a legitimate message. Such attacks become possible when a signing algorithm uses a hash function whose second-preimage resistance is broken. In such scenarios, if a signed artifact contains multiple signatures, the attacker can strip the non-vulnerable signatures and modify the message. That could lead to a downgrade where the forged message passes verification with the signature that employs the insecure hash. This type of attack was a concern for SHA-1-based signatures during the transition to SHA-256; while no practical SHA-1 second pre-image attack is known (i.e., finding a message which verifies with the signature of another message), the concern was enough to spur a transition.
- Key Recovery or Universal Forgery: In this scenario, the adversary can simply generate a new valid signature on any message. Typically, the adversary would need to recover the signer’s private key to do this. In the case of classical algorithms such as RSA and ECDSA, a cryptanalytically relevant quantum computer (CRQC) could recover a private key using Shor’s algorithm. If a second signature, not vulnerable to a CRQC, is included in an artifact along with the classical one, an adversary could strip the quantum-resistant signature, or even re-sign any artifact using only the recovered classical private key, achieving a universal forgery for any message.
In summary, an artifact signing standard that allows for multiple signatures can be susceptible to downgrades where the adversary strips the signatures it cannot break. To protect against downgrades, some standards have introduced counter signatures which basically sign twice sequentially (the second signature signs the first one). Other mitigations include adding a reference or binding digest value in a signature which points to the other ones in the artifact. That makes upgraded verifiers aware of the other signatures and prevents Signature-Hash Second Pre-image downgrades, but it does not prevent Universal Forgeries. The mitigation against Universal Forgeries includes new verification logic. A verifier that can understand the new signature will need to ensure it is present and passes validation. Legacy verifiers can still use “OR logic” and verify only the classical signature they understand although susceptible to forgeries.
New signature algorithms may sometimes be considered riskier than old, mature algorithms. This has led adopters to sometimes pursue what are called Post-Quantum/Traditional (PQ/T) Hybrid Digital Signatures, which include a classical signature (e.g. ECDSA) and a new post-quantum signature (e.g. ML-DSA). A verifier that wants to ensure that the signature is secure even if there is an unknown issue with one of the two would verify both signatures before passing verification altogether. To introduce PQ/T Hybrid support in artifact signing, “OR logic” would need to be adjusted in verifiers. They would still enforce “OR logic” for algorithms they support, but when seeing a classical and a new post-quantum algorithm, they would need to enforce “AND logic”. If they saw a quantum-resistant signature like SLH-DSA, which is based on mature primitives, verifiers could enforce “SLH-DSA only” logic and ignore the classical (e.g. ECDSA) signature. These approaches assume that signers introduce two or more signatures in the artifact, and verifiers can enforce the desired verification logic.
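The three verification policies described above could be sketched as follows. This is an illustrative model only, not any standard’s normative algorithm; the algorithm groupings and the boolean stand-ins for cryptographic verification results are hypothetical.

```python
# Hypothetical sketch of the verification policies discussed above.
# Each signature is modeled as (algorithm_name, is_valid), where
# is_valid stands in for an actual cryptographic verification result.

CLASSICAL = {"ecdsa", "rsa"}
PQ_NEW = {"ml-dsa"}       # newer post-quantum algorithm
PQ_MATURE = {"slh-dsa"}   # post-quantum algorithm based on mature primitives

def verify_artifact(signatures):
    algs = {alg for alg, _ in signatures}
    # "SLH-DSA only" logic: a mature PQ signature alone decides the outcome.
    if algs & PQ_MATURE:
        return all(ok for alg, ok in signatures if alg in PQ_MATURE)
    # PQ/T hybrid "AND logic": classical and new PQ must both verify.
    if (algs & CLASSICAL) and (algs & PQ_NEW):
        return all(ok for alg, ok in signatures
                   if alg in CLASSICAL | PQ_NEW)
    # Legacy "OR logic": one understood, valid signature suffices.
    return any(ok for alg, ok in signatures if alg in CLASSICAL | PQ_NEW)
```

Note how the policy choice depends only on which algorithm families are present, which is exactly the awareness an upgraded verifier needs to add.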
Below we discuss how these issues are addressed in common signing standards and identify gaps where the standard or its implementation would need to change to support quantum-resistant signatures.
CMS
Cryptographic Message Syntax (CMS) is a standardized format that can sign, authenticate, or encrypt arbitrary message content. It is used in various use-cases including software and document signing. draft-ietf-lamps-cms-ml-dsa and draft-ietf-lamps-cms-sphincs-plus specify the use of new post-quantum algorithms.
IETF RFC5652, the CMS standard, supports multiple signatures of the signedData content in SignerInfo structures. It specifies “OR logic”, initially introduced in RFC4853, which allows for the verifier to pass verification by validating only one of the included signatures. That enables signing algorithm migrations where verifiers may not support all included signatures in the content. Thus, they could validate only one of them and not fail verification.
Although “OR logic” has existed in CMS for a long time, implementations may still not have adopted it. For example, OpenSSL, a very popular cryptographic library, implements CMS. Its CMS_verify() function attempts to verify all SignerInfo signatures in the signedData. In the case of a migration to new post-quantum signatures in CMS, a signer could add a classical and a quantum-resistant signature to the content, but if an OpenSSL-based legacy verifier that could not consume the post-quantum signature attempted to verify all signatures, verification would fail. To enable a post-quantum migration in CMS, CMS implementations need to support the specification’s “OR logic”.
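The difference between the specification’s “OR logic” and a verify-all implementation can be sketched as follows. This is an illustrative model, not OpenSSL’s API; the tuple representation of SignerInfos is hypothetical.

```python
# Illustrative contrast between "OR logic" (RFC 4853) and verify-all
# behavior. Each SignerInfo is modeled as (algorithm, is_valid).

def verify_or_logic(signer_infos, supported):
    """Pass if at least one understood signature verifies (RFC 4853)."""
    return any(ok for alg, ok in signer_infos if alg in supported)

def verify_all(signer_infos, supported):
    """Fail if any signature cannot be verified (verify-all behavior)."""
    return all(alg in supported and ok for alg, ok in signer_infos)

dual = [("rsa", True), ("ml-dsa", True)]
legacy = {"rsa"}  # a legacy verifier that only understands RSA

# The legacy verifier passes under "OR logic" ...
assert verify_or_logic(dual, legacy)
# ... but fails under verify-all, because it cannot consume ML-DSA.
assert not verify_all(dual, legacy)
```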
To protect against Signature-Hash Second Pre-image downgrades, CMS introduced a MultipleSignatures signed attribute in RFC5752. MultipleSignatures serves as a reference to the other signature algorithms and values included in other SignerInfos of an artifact signed with multiple algorithms. Given that the MultipleSignatures attribute is signed, a miscreant cannot strip it to deploy a downgrade and force verifiers to verify only the signature of its liking. For example, an attacker attempting a downgrade of a message dual-signed with RSA-with-SHA1 and RSA-with-SHA256 and carrying MultipleSignatures would need to find a second pre-image of the SHA-1 digest of the legitimately signed CMS attributes, which include the content digest and the referenced signature algorithms and values in MultipleSignatures. As such, downgrading such signed artifacts is impractical unless there is a practical SHA-1 second pre-image attack. Even without the MultipleSignatures attribute, it is arguably impractical to forge an artifact that verifies with the same signature as a legitimate one unless there is a practical second pre-image attack on SHA-1 (used to digest the content). Unfortunately, Universal Forgery downgrades would be trivial in CMS if a CRQC came into existence. An adversary with a CRQC could implement Shor’s algorithm and recover the classical signature private key. This means they would be able to take legitimately signed CMS content, strip any quantum-resistant signatures or other fields, and re-sign with the classical private key. Thus, MultipleSignatures would not achieve any downgrade protection against such scenarios.
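A simplified model of how a MultipleSignatures-aware verifier can detect signature stripping follows. The dictionary structures are hypothetical stand-ins; see RFC 5752 for the actual attribute syntax, and note this check does not help against a Universal Forgery, as discussed above.

```python
# Simplified model of the RFC 5752 MultipleSignatures check.
# Each SignerInfo carries a signed "references" list naming the other
# signature algorithms the signer originally included in the artifact.

def detect_stripping(signer_infos):
    """Return True if a signed reference points to a signature that is
    no longer present in the artifact (a stripping downgrade)."""
    present = {si["alg"] for si in signer_infos}
    for si in signer_infos:
        for referenced_alg in si.get("references", []):
            if referenced_alg not in present:
                return True
    return False

legit = [
    {"alg": "rsa-sha256", "references": ["ml-dsa"]},
    {"alg": "ml-dsa", "references": ["rsa-sha256"]},
]
stripped = [legit[0]]  # attacker removed the ML-DSA SignerInfo
```

Because the references are covered by each signature, the attacker cannot simply edit them out without invalidating the remaining signature.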
In a post-quantum world, a new quantum-resistant algorithm like ML-DSA may sometimes be deployed in a PQ/T Hybrid fashion to hedge against the risk of unforeseen issues with the new algorithm. In the CMS context, PQ/T Hybrid signing means that the signedData includes two signatures in two separate SignerInfos, which both need to be verified by the verifier. Thus, to implement PQ/T Hybrid signing in CMS, the “OR logic” would need to be replaced with “AND logic” when the upgraded verifier sees a classical signature and a post-quantum one with an algorithm that it does not consider “adequately mature”. If “AND logic” is not an option, another way of supporting PQ/T Hybrid signatures is composite signatures, which concatenate the classical and the quantum-resistant signature values and treat them as a single quantum-resistant signature which upgraded verifiers must verify. Composite signatures can be treated as a new signature scheme and can offer a clear verification path to a trusted composite root, but they mean an additional transition to a composite PKI between the classical and the pure post-quantum PKI.
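The composite idea can be sketched as a single scheme whose signature value carries both components and whose verification has “AND” semantics built in. This is an illustrative sketch only; the actual composite encodings and lengths are defined in the relevant IETF drafts, and the verifier callbacks below are stubs.

```python
# Illustrative composite verification: the classical and post-quantum
# signature values travel together as one signature, and both must verify.
# verify_classical / verify_pq are stubs standing in for real verifiers.

CLASSICAL_SIG_LEN = 64  # hypothetical fixed component length for the sketch

def verify_composite(message, composite_sig, verify_classical, verify_pq):
    classical_sig = composite_sig[:CLASSICAL_SIG_LEN]
    pq_sig = composite_sig[CLASSICAL_SIG_LEN:]
    # "AND" semantics are part of the composite scheme itself, so the
    # surrounding protocol's "OR logic" does not need to change.
    return (verify_classical(message, classical_sig)
            and verify_pq(message, pq_sig))
```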
On the other hand, if the post-quantum algorithm is considered mature (e.g. SLH-DSA), a verifier that requires quantum resistance could only validate that. A legacy CMS verifier that can only consume a classical signature would still remain operational by verifying only the algorithm it understands. Existing CMS implementations may not support such functionality.
CMS also offers a counter signature attribute which allows a signer to provide a signature on another signature. That approach could provide PQ/T Hybrid functionality where the first signature uses a quantum-resistant scheme which is subsequently signed by the classical algorithm. A legacy verifier would need to ignore the sub-signatures in the counter signature it does not understand by following the “OR logic” specified in RFC5652. It is unclear if CMS implementations implement counter signatures and enforce “OR logic”, as counter signatures are not commonly used. OpenSSL 3.4 did not support them at the time of this writing, but GitHub issue #12576 discussed changes to generally support counter signature attributes correctly.
UEFI Signing
Unified Extensible Firmware Interface (UEFI) is a specification for the firmware / software architecture of platforms. It includes mechanisms for signing and authenticating the boot sequence of the software loaded on various computer systems. UEFI enables algorithm migrations by supporting multiple signatures which can be verified independently.
This mechanism was useful when migrating from operating system (OS) signatures that used SHA-1 to ones using SHA-256. As firmware may sometimes be hard to upgrade, legacy system firmware still had to verify the SHA-1 signature before loading the OS. Upgraded firmware verified the SHA-256 signature of the OS and then proceeded with loading it. To achieve such dual signing, the UEFI specification defines an Embedded Digital Certificates format which includes a Certificate Table data directory entry. The table includes bCertificate binaries which can consist of certificates and signatures in PKCS#7 SignerInfo structures. When validating an image, UEFI verifies its signatures against known trusted keys / root certificates and checks them against dbx, its negative list database, before adding them to its trusted signature db database.
So, EFI images can carry multiple signatures which allow for verifiers to validate the ones they support. That also allows for Signature-Hash Second Pre-image and Universal Forgery downgrades where an attacker can strip any signature it wants in order to force a verifier to load an image signed with an algorithm of its choice. The Appendix – Downgrade PoC in UEFI Dual Signatures includes a short Proof-of-Concept downgrade on a dual signed image. To prevent such risks, UEFI will need to introduce functionality in the verification which may require certain trusted signature algorithms.
UEFI Embedded Digital Signatures
If UEFI introduces support for ML-DSA, it may choose to support PQ/T Hybrid signatures to mitigate theoretical unknown risks in ML-DSA. To introduce such functionality, the UEFI specification would need to introduce new validation logic for verifying both signatures before adding them to the trusted db database. The classical and the post-quantum signature would be carried in SignerInfos in two different PKCS#7 signedData structures. A verifier would need to parse both signatures and verify them using the signer certificates in the SignerInfos. While checking against trusted signatures, UEFI would need to confirm that both the classical and the PQ signatures are included in the db database. UEFI already supports dual signatures for such algorithm transitions. What it does not support is enforcing “AND logic” by a verifier that wants to combine the two signatures and pass verification only if both signatures pass. UEFI verifiers that need to support PQ/T Hybrid signatures without downgrades would need to introduce such new verification logic for dual signatures, one of which is quantum-resistant. Alternatively, UEFI could adopt composite signatures that concatenate the classical and the quantum-resistant signature and treat them as the quantum-resistant algorithm that upgraded verifiers must verify. That way, UEFI verification would not need to be updated to support “AND logic” for PQ-hybrid signing.
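The PQ/T Hybrid check just described might look like the following. This is a sketch under a deliberately simplified model of the certificate table, db and dbx; none of the names below come from the UEFI specification, and the `verify` callback is a stub for the real cryptographic check.

```python
# Simplified sketch of UEFI PQ/T hybrid verification: the image carries
# two PKCS#7 signedData entries (classical and post-quantum); both must
# verify, and both signer certificates must be trusted in db.

def verify_uefi_hybrid(image_sigs, db, dbx, verify):
    """image_sigs: list of (cert, alg, sig); verify: crypto check stub."""
    classical_ok = pq_ok = False
    for cert, alg, sig in image_sigs:
        if cert in dbx:                  # forbidden certificates fail hard
            return False
        if cert not in db or not verify(alg, cert, sig):
            continue
        if alg in ("rsa", "ecdsa"):
            classical_ok = True
        elif alg == "ml-dsa":
            pq_ok = True
    return classical_ok and pq_ok        # "AND logic" for the hybrid pair
```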
The UEFI specification will also face another challenge with quantum-resistant signatures. Currently, the signer signs the digest of the content (image and associated data) and adds the signature to the image file. It also encrypts (with RSA) the digest of the content with the signer’s private key. The encrypted digest is carried in an encryptedDigest field in the PKCS#7 signedData. The purpose of this field is for the verifier to confirm the signed data has not been tampered with. Its usefulness is limited, assuming the digest is collision resistant, because signatures already sign the digest of the message. The newer CMS specification’s signedData does not include an encryptedDigest like PKCS#7 signedData. Quantum-resistant signature public keys cannot be used, like RSA, to encrypt digests of the signed message. Alternatively, a post-quantum KEM with an asymmetric hybrid public key encryption (HPKE) scheme could achieve the goal, but that involves lots of new functionality. As such, UEFI cannot continue to use encryptedDigest to encrypt the message digests with quantum-resistant signatures.
Additionally, the new quantum-resistant signature specifications (e.g. ML-DSA, SLH-DSA) sign an internal digest which is not a plain hash of the message. ML-DSA includes the public key in a SHAKE256 calculation of the message, called mu. SLH-DSA includes a random value along with public key information in the internal digest calculation, called Hmsg. Thus, the typical digest carried in the PKCS#7 signedData, calculated over various fields of the image format in UEFI, does not lend itself to use with the new quantum-resistant signatures. Fortunately, PKCS#7 (RFC2315) and CMS (RFC5652) can support signing a plain digest of some content, but they require the use of authenticatedAttributes (signedAttributes in CMS) which carry the digest. The DER encoding of the authenticatedAttributes is then signed with the quantum-resistant signature. For a better explanation, draft-ietf-lamps-cms-ml-dsa and draft-ietf-lamps-cms-sphincs-plus describe how SignedAttributes can be used with the new signatures in CMS. To use digests with the new signatures, UEFI would need to switch to using authenticatedAttributes. What’s more, UEFI ought to follow the recommendations in draft-ietf-lamps-cms-ml-dsa and draft-ietf-lamps-cms-sphincs-plus to use digestAlgorithm functions of proper security with the new signatures.
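Conceptually, the switch to authenticatedAttributes/signedAttributes means the signature input becomes the encoding of an attribute set containing the content digest, rather than the digest itself. A minimal sketch, using only hashlib, follows; the attribute encoding here is a deliberately fake stand-in for real DER-encoded SignedAttributes.

```python
import hashlib

# Simplified sketch: the content digest is placed inside a
# signed-attributes structure, and the signature is then computed over
# the encoding of those attributes (not over the raw digest).
# Real CMS uses DER-encoded SignedAttributes; this encoding is fake.

def build_signature_input(content: bytes) -> bytes:
    """Return the bytes the signature algorithm would actually sign."""
    message_digest = hashlib.sha256(content).digest()
    # Stand-in for DER-encoded signedAttributes
    # (content-type attribute + message-digest attribute).
    return b"content-type:id-data|message-digest:" + message_digest

def digest_attribute_matches(content: bytes, attrs: bytes) -> bool:
    """Verifier-side check that the attribute digest matches the content."""
    return attrs.endswith(hashlib.sha256(content).digest())
```

The point of the sketch is that algorithms like ML-DSA and SLH-DSA then process the attribute bytes through their own internal digest constructions (mu, Hmsg), so no external “encrypted digest” is needed.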
Finally, UEFI uses PKCS#7 constructions which, although similar to CMS, are not fully interoperable. OpenSSL implements PKCS#7 and is used in TianoCore EDK II, UEFI’s reference implementation. Any changes in the aforementioned PKCS#7 constructions or verification logic for UEFI will need to be supported by OpenSSL’s or other PKCS#7 implementations.
Some of these issues were brought up in Tianocore EDK II Git issue #10279.
COSE
CBOR Object Signing and Encryption (COSE) is specified in RFC8152. It is a small code and message size data format that provides signatures, message authentication, and encryption of arbitrary messages using Concise Binary Object Representation (CBOR) for serialization.
COSE, like CMS, supports multiple signatures in a COSE_Sign object and it references the same “OR logic” as CMS RFC5652. That would enable migrations where legacy verifiers validate a classical signature they understand, and upgraded verifiers validate the new signature. At the same time, it means a miscreant could strip signatures from the object to achieve a Signature-Hash Second Pre-image or Universal Forgery downgrade. RFC8152 refers to RFC5752 for multiple signature evaluations and signature bindings but, as explained above, these bindings do not offer significant security when one of the algorithms collapses like classical asymmetric cryptography does against a CRQC implementing Shor’s algorithm. To protect against downgrades, an upgraded verifier would need to enforce new logic where it must validate a new (quantum-resistant) signature.
Alternatively, COSE also supports a ‘crit’ header which lists parts of the structure that must be understood and processed. A new (quantum-resistant) signature could be marked as critical by the signer to ensure that it is processed during verification, but that could mean that legacy verifiers which cannot consume the signature would not be able to pass verification in a dual signature scenario.
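The trade-off of marking the post-quantum signature as critical can be sketched as follows. This is an illustrative model; real COSE processing operates on CBOR structures and header labels, and the dictionary representation below is hypothetical.

```python
# Illustrative 'crit' processing: a verifier must reject a message
# containing critical parts it cannot understand, otherwise "OR logic"
# over the signatures it does understand applies.

def process_cose_sign(signatures, understood_algs):
    """signatures: list of dicts with 'alg', 'crit' (bool), 'valid' (bool)."""
    for sig in signatures:
        if sig["crit"] and sig["alg"] not in understood_algs:
            return False   # cannot skip an unprocessable critical part
    return any(sig["valid"] for sig in signatures
               if sig["alg"] in understood_algs)
```

Under this model, marking the quantum-resistant signature critical stops downgrades at upgraded verifiers, but it also breaks legacy verifiers, which is the backwards-compatibility trade-off noted above.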
To support PQ/T Hybrid verification, an upgraded verifier would have to use “AND logic” to validate both the classical and the post-quantum signature. As another way to introduce PQ-hybrid support, COSE could use composite signatures that concatenate the classical and the quantum-resistant signature and treat them as the quantum-resistant scheme that upgraded verifiers must verify.
COSE also offers counter signatures which could introduce PQ/T Hybrid signing where an artifact is signed with quantum-resistant algorithms and a counter signature signs that with a classical algorithm. RFC9338 specifies a CounterSignatureV2 option for COSE. The original document (RFC8152) specified a Countersignature which operates slightly differently. In either case, counter signatures could work if legacy COSE verifiers ignore them when they do not support the new (quantum-resistant) algorithm. Legacy verifiers failing verification due to the quantum-resistant signature would introduce backwards compatibility problems.
Some of these topics were discussed in the IETF COSE WG.
OpenTitan
OpenTitan is an open-source design and set of guidelines for hardware root of trust (RoT) chips. It is used by vendors that build hardware for authentication applications, which include code signing and verification. Some OpenTitan-based chips ship in hardware from common vendors today.
The OpenTitan project uses its own data format. At the time of this writing, it was introducing new post-quantum signature functionality that incorporates two signatures and uses “AND logic” to verify the classical and the quantum-resistant one for conservative security. The OpenTitan Signing Guide documents how to attach multiple signatures to binaries. OpenTitan’s git issue #22938 discussed PQ/T Hybrid signing and the choice OpenTitan made to validate both signatures.
OpenTitan’s verification configuration is stored in fuses and is applied at manufacturing time. As such, if a booted manifest does not include the expected signatures, booting will fail. Thus, the downgrades discussed in this document are not possible in OpenTitan’s architecture. The topic was discussed in OpenTitan Git issue #22938.
Acknowledgements
We want to thank Shahram Jamshidi from Altera, Peter Jones from Red Hat, Falko Strenzke, Robert Elliot from HPE, and Russ Housley for their valuable feedback.
Appendix – Downgrade PoC in UEFI Dual Signatures
Below we show how to take an EFI image signed with two signatures, strip the second one, which we assume uses a secure signing algorithm, and leave the first, which is assumed to be insecure. This would be a typical downgrade attack. If a CRQC existed, an example of an insecure algorithm would be RSA or ECDSA, whose signatures could be broken with Shor’s algorithm, and a secure one would be ML-DSA or SLH-DSA. For the demonstration, we use sbattach from sbsigntool, a popular toolset that can manipulate signatures on UEFI binaries and drivers.
# The example dual signed EFI image is signed-example.efi.
# An adversary wants to remove the second signature which we assume is secure.
# First it removes the insecure signature which we assume appears first in the artifact and saves it in a DER encoded file.
sbattach --detach insecure-sig.der --remove signed-example.efi
# Then it adds back the insecure signature to appear second in the artifact.
sbattach --attach insecure-sig.der signed-example.efi
# And removes the secure signature which appears first.
sbattach --detach secure-sig.der --remove signed-example.efi
# The attacker has implemented a downgrade and can now make verifiers pass verification of an insecurely signed EFI file.