Donjon | 02/26/2026
Quantum Computing’s Threat to Blockchain: The Enduring Need for Secure Keys
As quantum computing advances, blockchain systems and hardware wallets must adapt to new signature algorithms and security constraints.
Key Takeaways:
- Quantum computers threaten current security models, enabling private-key recovery from exposed public keys.
- The threat isn’t immediate, but the “harvest now, decrypt later” scenario creates long-term vulnerability.
- PQC (post-quantum cryptography) signature standards exist, but bring trade-offs: larger signatures, higher memory use, and more complex implementations.
- Hardware wallets must run PQC inside constrained Secure Elements, making RAM limits and side-channel protection major challenges.
- Migration must be coordinated, or fragmented adoption will introduce new operational and security risks.
Quantum computing is rapidly advancing, bringing with it the potential to revolutionize various industries. However, this powerful technology also poses a significant threat to existing cryptographic systems, especially those underpinning blockchain ecosystems.
While the focus often shifts to the quantum threat, it’s crucial to remember that the fundamental necessity of securing private keys remains paramount. Furthermore, hardware wallets, while offering enhanced security, come with their own set of constraints, and certain cryptographic algorithms are better suited for their limited computational capabilities, even if they are more resource-intensive for blockchain nodes.
We offer an overview of these constraints, possibilities, and preferences for the future of hardware wallets. But first, as a reminder of why post-quantum cryptography matters for hardware wallets, let’s examine how exactly quantum computing threatens today’s blockchain cryptography foundations.
The Quantum Threat to Blockchain
Blockchain technology relies heavily on cryptographic primitives, primarily public-key cryptography, to secure transactions and ensure network integrity. Quantum computers, with their ability to solve certain mathematical problems much faster than classical computers, could potentially break these cryptographic foundations. Specifically, algorithms like Shor’s algorithm threaten the elliptic curve cryptography (ECC) widely used in blockchain for authorizing transactions through digital signatures such as ECDSA or Schnorr/EdDSA. If a powerful enough quantum computer emerges, it could theoretically derive private keys from public keys, allowing attackers to compromise wallets and manipulate blockchain transactions.
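To see why public keys are safe classically but not against a quantum adversary, consider how an ECC key pair is built. The sketch below (illustrative Python, not production code; the toy private key is a made-up value) derives a secp256k1 public key from a private key by scalar multiplication. Going forward takes a few hundred group operations; inverting it is the elliptic-curve discrete-logarithm problem, infeasible classically but solvable efficiently with Shor's algorithm.

```python
# secp256k1 parameters (Bitcoin's curve): y^2 = x^3 + 7 over F_p
P = 2**256 - 2**32 - 977
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def ec_add(p1, p2):
    """Add two points on secp256k1 (affine coordinates; None is the identity)."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None  # opposite points sum to the identity
    if p1 == p2:
        lam = (3 * x1 * x1) * pow(2 * y1, -1, P) % P  # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P     # chord slope
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def scalar_mult(k, point=G):
    """Compute k*G by double-and-add: cheap forward, hard to invert classically."""
    result = None
    while k:
        if k & 1:
            result = ec_add(result, point)
        point = ec_add(point, point)
        k >>= 1
    return result

priv = 0xC0FFEE          # toy private key; real keys are random 256-bit scalars
pub = scalar_mult(priv)  # the public key an attacker would see on-chain
```

Shor's algorithm recovers `priv` from `pub` in polynomial time on a large enough quantum computer, which is exactly the exposure discussed above.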
This threat is important but not immediate: practical quantum computers capable of breaking current cryptographic standards are still some time away. Still, the “harvest now, decrypt later” scenario, in which encrypted data is collected today with the intention of decrypting it once quantum computers are mature enough, remains a concern. From a blockchain perspective this does not apply directly, as there is nothing to decrypt per se.
However, blockchains effectively perform the “harvest” step in public: any output whose public key is openly recorded on chain (for example, bitcoins from the “Satoshi era”, which used bare public keys as addresses, or reused addresses for which a spend has already occurred, revealing the public key) is exposed, making its associated private key vulnerable. Estimates vary, but roughly 25% of the Bitcoin network (by value) has its public key exposed and would be immediately vulnerable to a quantum attack.
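A short sketch illustrates why hash-based addresses delay exposure. It is deliberately simplified: real Bitcoin P2PKH applies SHA-256 followed by RIPEMD-160, while this uses a single SHA-256, and the public-key bytes are hypothetical.

```python
import hashlib

def address_from_pubkey(pubkey: bytes) -> str:
    # Simplified stand-in for Bitcoin's HASH160 (SHA-256 then RIPEMD-160).
    return hashlib.sha256(pubkey).hexdigest()

# Hypothetical 33-byte compressed public key (0x02 prefix + x-coordinate).
pubkey = bytes.fromhex("02" + "11" * 32)
addr = address_from_pubkey(pubkey)

# Before any spend, the chain records only `addr`. Shor's algorithm needs the
# public key itself, so hashed addresses buy time -- until the first spend
# reveals the key, or when the key was published directly (pay-to-pubkey).
```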
Given these potential risks, even if they are not imminent, the industry cannot afford to wait until quantum computers are operational before updating its cryptography.
Preparing for Post-Quantum
The good news is that the cryptographic community has been preparing for years.
The NIST standardization process for post-quantum algorithms started in 2016 and is now entering its final stages, selecting new signature schemes designed to resist quantum attacks.
At the time of writing, two main families of post-quantum signature schemes stand out, each with distinct benefits and drawbacks:
- Hash-based: Their design is relatively straightforward, relying only on the well-understood security properties of cryptographic hash functions. This makes their security highly trusted, even in a post-quantum world. However, these schemes often suffer from very large signatures, and they lack the mathematical structure needed for more advanced cryptographic constructions such as homomorphic key derivation à la BIP-32 or threshold signatures.
- Lattice-based: Lattice cryptography, on the other hand, offers compact keys and signatures and enables more versatile designs thanks to its rich mathematical structure. These algorithms are generally more efficient, especially in verification, and are well suited for large-scale or embedded deployments. However, their security assumptions are more recent and less battle-tested than those of hash-based schemes, and their implementations can be complex — for example, Falcon relies heavily on floating-point operations.
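To make the hash-based trade-off concrete, here is a toy Lamport one-time signature, a conceptual ancestor of modern hash-based schemes like SPHINCS+. Its security reduces entirely to the hash function, but a single signature already weighs 8 KiB, and each key pair must be used only once.

```python
import hashlib
import os

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen(bits: int = 256):
    # Secret key: one pair of random 32-byte preimages per message-digest bit.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(bits)]
    # Public key: the hash of every preimage.
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(msg: bytes, sk):
    digest = int.from_bytes(H(msg), "big")
    # Reveal one preimage per bit of the digest (this is why the key is one-time).
    return [sk[i][(digest >> i) & 1] for i in range(len(sk))]

def verify(msg: bytes, sig, pk) -> bool:
    digest = int.from_bytes(H(msg), "big")
    return all(H(sig[i]) == pk[i][(digest >> i) & 1] for i in range(len(pk)))

sk, pk = keygen()
sig = sign(b"hello", sk)
assert verify(b"hello", sig, pk)
# Signature size: 256 preimages of 32 bytes = 8,192 bytes, vs ~64 bytes for ECDSA.
```

Real schemes such as SPHINCS+ build trees of such one-time keys to allow many signatures, at the cost of even larger signature sizes.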
In Europe, the French cybersecurity agency ANSSI has expressed similar views. In its 2023 technical note on quantum-safe cryptography, ANSSI highlighted lattice-based schemes as the most promising approach for the medium term and stated it is closely following the NIST process and academic research.
Each of these selected algorithms comes with trade-offs, so let’s take a closer look at the three schemes that have been selected as of today:
- SPHINCS+ is a stateless hash-based signature scheme. It was standardized under the name SLH-DSA in FIPS 205. Its main advantages are short public keys (32 bytes) and security that relies only on standard, well-studied properties of hash functions. However, it has very large signatures (8 to 17 kB depending on the security level), slow signature generation, and high RAM usage, making it less than ideal for embedded systems and distributed systems like blockchains. We will therefore not include it in this study.
- Falcon is a signature scheme based on so-called “NTRU lattices” and Fast Fourier sampling. It is expected to be formally standardized under the name FN-DSA soon. This means there is no official implementation to date, and current implementations do not guarantee forward compatibility (although the standardized version should not deviate much from the one submitted to the selection process). Its main advantages are fast signature verification and relatively small key and signature sizes (albeit still much bigger than in classical cryptography). Its main drawback is the extensive use of floating-point operations, resulting in complex key generation and signing.
- Dilithium, also known by its standard name ML-DSA, is a signature scheme based on so-called “module lattices”. It was officially standardized by NIST in FIPS 204. Its main advantage is relatively fast signing and key generation thanks to well-known modular arithmetic. However, it has larger public key and signature sizes and higher RAM usage compared to Falcon.
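The size trade-offs above can be summarized numerically. The figures below come from FIPS 204 (ML-DSA parameter sets) and the Falcon specification (padded signature format), with classical ECDSA over secp256k1 (compressed public key, raw 64-byte signature) for comparison.

```python
# Public-key and signature sizes in bytes.
SIZES = {
    "ECDSA-secp256k1": {"pk": 33,   "sig": 64},
    "Falcon-512":      {"pk": 897,  "sig": 666},
    "Falcon-1024":     {"pk": 1793, "sig": 1280},
    "ML-DSA-44":       {"pk": 1312, "sig": 2420},
    "ML-DSA-65":       {"pk": 1952, "sig": 3309},
    "ML-DSA-87":       {"pk": 2592, "sig": 4627},
}

for name, s in SIZES.items():
    blowup = (s["pk"] + s["sig"]) / (33 + 64)
    print(f"{name:16} pk={s['pk']:5} sig={s['sig']:5}  ~{blowup:5.1f}x ECDSA")
```

Falcon's compactness relative to ML-DSA is visible here, which is precisely why its floating-point implementation difficulties (discussed later) are so frustrating.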
As hardware wallet developers, we must now evaluate how these new algorithms behave in constrained environments and what their integration would imply for secure and efficient implementations. Note that in a blockchain context, the signature authorizing a transaction is generated once by the user spending the funds, while every node in the network must verify it. This setup suggests that optimizing verification cost over signature computation cost might be beneficial. However, this approach complicates the implementation of signature computation within a Secure Element. Therefore, striking a careful balance between these two factors is essential.
Hardware Wallets: Benefits and Constraints
Hardware wallets are widely recognized as the most secure way to store crypto assets. They keep keys offline, isolated from internet-connected devices, thereby significantly reducing the risk of online theft. Transactions are signed within the device itself, and only the signed transaction is broadcast to the network. However, when considering PQC (post-quantum cryptography) algorithms for hardware wallets, there’s a trade-off:
- Algorithms better suited for hardware wallets: Certain PQC algorithms, particularly lattice-based schemes like ML-DSA and ML-KEM, are being considered for their efficiency on resource-constrained devices, because their core computations are largely lattice arithmetic. This arithmetic is generally straightforward to implement efficiently on small processors, which can make these schemes a good fit for hardware wallets.
- Increased burden on blockchain nodes: While these algorithms may be efficient on the hardware wallet itself, their larger key and signature sizes increase transaction sizes. This, in turn, places a greater burden on blockchain nodes in terms of the storage, bandwidth, and processing power required to validate and propagate transactions. This is a crucial consideration for the scalability and decentralization of future quantum-resistant blockchain networks.

A key principle of blockchain decentralization is that running a node should remain simple and accessible to as many participants as possible. Post-quantum algorithms typically require larger keys and signatures and involve more complex computations, both of which can significantly increase the cost and resource demands of node operation. Achieving a smooth transition to post-quantum blockchains will therefore require coordinated efforts across the community.
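A back-of-envelope calculation shows the node-side impact of larger signatures. The non-signature byte count below is an assumption for a simple one-input, two-output payment, not a measured figure; the key and signature sizes are the standardized ones.

```python
# Illustrative impact of signature size on transaction size.
BASE_TX = 150  # assumed bytes of non-signature transaction data

def tx_size(sig_bytes: int, pk_bytes: int) -> int:
    # A spend must reveal both the public key and the signature.
    return BASE_TX + sig_bytes + pk_bytes

ecdsa = tx_size(64, 33)        # classical ECDSA over secp256k1
mldsa87 = tx_size(4627, 2592)  # ML-DSA-87 sizes from FIPS 204
print(f"ECDSA tx ~{ecdsa} B, ML-DSA-87 tx ~{mldsa87} B "
      f"(~{mldsa87 / ecdsa:.0f}x more block space per payment)")
```

Under these assumptions a single post-quantum payment consumes roughly thirty times the block space of its classical equivalent, which translates directly into node storage and bandwidth.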
To move from theory to practice, we must now evaluate how these algorithms behave on actual embedded hardware. Benchmarking can provide insights into memory requirements, cycle counts, and feasibility under real-world conditions.
Benchmarking PQC Algorithms on Embedded Hardware
When each algorithm is proposed, and especially after it is accepted, a series of benchmarks is run to test its performance across various metrics. The results are usually published in the final draft (and/or subsequent papers) of the algorithm proposal. To validate our understanding and assumptions, we first reproduced these reference benchmarks on a traditional microcontroller. Using a general-purpose MCU (microcontroller unit) widely used in the electronics industry (specifically the STM32F439ZI) running at 24 MHz (as commonly done in the literature), we confirmed the general performance trends reported by others.
We then extended our analysis to a more security-relevant environment: a Secure Element, the security core of Ledger’s devices. Usually, Secure Elements come with dedicated cryptographic accelerators for performance and security reasons. However, as PQC is still quite recent, current Secure Elements do not embed any PQC accelerator, so we implemented the tested algorithms fully in software.
We decided to focus this first study on memory usage, as it is the most constraining metric for embedded PQC. Indeed, Secure Elements usually come with very limited amounts of RAM, and PQC algorithms, as previously discussed, are known for very large object sizes.
Below are the preliminary memory usage results we obtained on the ST33K Secure Element for the two currently standardized lattice-based algorithms, ML-DSA and Falcon (soon to become the FN-DSA standard).
Note: results are reported with the Secure Element’s generic countermeasures disabled.
Memory Usage (in bytes)
| Operation / Algorithm | Falcon-512 | Falcon-1024 | ML-DSA-44 | ML-DSA-65 | ML-DSA-87 |
|---|---|---|---|---|---|
| Key Generation | 14,420 | 27,552 | 46,320 | 5,712 | 5,712 |
| Signing | 32,396 | Not enough RAM | 6,056 | 7,600 | 9,128 |
| Signature Verification | 5,268 | 5,364 | 4,088 | 4,088 | 4,088 |
Execution time (in milliseconds – averages)
| Operation / Algorithm | Falcon-512 | Falcon-1024 | ML-DSA-44 | ML-DSA-65 | ML-DSA-87 |
|---|---|---|---|---|---|
| Key Generation | 1,094 | 3,735 | 533 | 1,046 | 1,780 |
| Signing | 835 | N/A | 1,096 | 2,309 | 4,383 |
| Signature Verification | < 100 | N/A | 437 | 808 | 1,490 |
A couple of points are worth noting to understand these results. First, both Falcon and ML-DSA come in multiple security levels (2 and 3 respectively), each with different object sizes and thus different computation costs. This is not to say that the lower levels are insecure; all are considered secure for the foreseeable future. We tested all of them, as each is likely to be used in various applications.
In the case of Falcon, the implementation tested is a non-optimized, plain C version which, we believe, has potential to be made lighter and/or faster. As for ML-DSA, we tested ZKNox’s low-RAM implementation (which follows techniques from a paper by NXP). It allows running the algorithm with a substantially reduced memory footprint at the cost of slower execution.
That said, in the case of Falcon we are already observing substantial memory usage, approaching the limits of a 64 KB Secure Element. And for ML-DSA, we are observing execution times that would seriously hinder the user experience, with signing times sometimes exceeding ten seconds.
These early results help us better understand the engineering challenges and optimization trade-offs involved in bringing PQC to secure embedded systems.
Security Considerations for Embedded PQC
Benchmarking performance is only one part of the picture. In secure hardware environments, especially in the context of hardware wallets, implementations must meet strict security requirements. Even if a cryptographic algorithm is theoretically secure against quantum attacks, its resistance to real-world threats (such as side-channel analysis or fault injection) depends heavily on the way it is implemented.
Secure elements operate under adversarial models that go far beyond typical software environments. They require:
- Constant-time execution, to avoid leaking sensitive information through timing variations.
- Masking techniques, to randomize internal computations and thwart power analysis.
- Fault detection and tolerance, to resist attacks that induce errors and exploit faulty signatures.
- Shuffling and jitter, to randomize execution order and introduce timing variability, making side-channel measurements harder to align and average.
These protections are not optional. They are fundamental requirements when manipulating long-lived cryptographic keys.
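As an illustration of the first requirement, here is the classic constant-time comparison pattern, sketched in Python for readability. On a real Secure Element this would be hand-checked C or assembly, since Python itself offers no timing guarantees; the sketch only shows the pattern.

```python
import hmac

def leaky_equal(a: bytes, b: bytes) -> bool:
    # Early exit: the time taken reveals how many leading bytes matched,
    # letting an attacker recover a secret one byte at a time.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def constant_time_equal(a: bytes, b: bytes) -> bool:
    # Accumulate differences so every byte is always inspected,
    # regardless of where the first mismatch occurs.
    if len(a) != len(b):
        return False
    diff = 0
    for x, y in zip(a, b):
        diff |= x ^ y
    return diff == 0

secret = b"\x13" * 32
assert constant_time_equal(secret, b"\x13" * 32)
assert not constant_time_equal(secret, b"\x13" * 31 + b"\x00")
# In real Python code, prefer the stdlib primitive: hmac.compare_digest(a, b)
```

The same discipline must extend to every secret-dependent branch and memory access in a PQC implementation, which is far harder for algorithms like Falcon whose sampling steps are inherently data-dependent.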
Hardware-specific Risks and Progress
These requirements have different implications for Falcon and ML-DSA, due to their underlying mathematical structures.
Falcon, which relies heavily on double-precision floating-point operations, will be difficult to implement in a constant-time manner. Consequently, it will require either emulated floating-point operations, which are very slow, or a per-chip analysis, which will be complex and brittle. Moreover, there is currently no official implementation designed specifically to counter side-channel or fault attacks in embedded devices.
For ML-DSA, a masked version has been proposed to enhance side-channel resistance. However, research has revealed remaining weaknesses, and countermeasures typically increase the cycle count by about 33%.
In summary, implementing PQC in hardware wallets will require not only selecting the right algorithm, but also carefully designing the embedded code to meet strong security guarantees.
Despite these challenges, progress is being made not only by manufacturers, but also by the broader developer community.
Community contributions through Ledger’s Open Platform
Ledger’s unique open-platform model has allowed its broader community to contribute and innovate over the years in an environment powered by Secure Elements. Post-quantum cryptography, especially in the blockchain industry, is the topic of many discussions and experiments, and the collaborative platform provided by the Ledger stack will bolster that effort.
In fact, we’ve already seen an example of this, with ZKNox coming up with a new ML-DSA implementation for constrained embedded devices that allows this PQC algorithm to run in a Ledger App. With it, they successfully showcased the first end-to-end, post-quantum, hardware-wallet-signed transaction on the Ethereum blockchain. This feat, in addition to their recent EIP proposal, is the type of innovation supported by the Ledger Open Platform.
Conclusion
The advent of quantum computing presents a significant, albeit not immediate, threat to the cryptographic foundations of blockchain. However, the fundamental principle of securing private keys will always remain paramount. Hardware wallets offer a robust solution for key storage but require careful consideration of their constraints.
As we transition to a quantum-resistant future, the selection of appropriate PQC algorithms will involve balancing the computational limitations of hardware wallets with the processing and storage requirements of blockchain nodes. At the same time, we must preserve what the ecosystem does best today: a near-universal consensus on elliptic-curve signatures, predominantly secp256k1. This uniformity makes audits reusable, libraries mature, and operations predictable. The real danger during a transition is fragmentation: chains, wallets, and exchanges choosing divergent algorithms, parameters, encodings, and address/descriptor formats would multiply the work for engineering and security reviews.
The path forward demands continuous research, development, and a proactive approach to security across the entire blockchain ecosystem — one that we will definitely be part of.
Yannick SEURIN & Alain MAGAZIN
Security Engineers
References
¹ https://eprint.iacr.org/2025/123.pdf
² https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.204.pdf
³ https://csrc.nist.gov/CSRC/media/Events/Second-PQC-Standardization-Conference/documents/accepted-papers/kannwischer-pqm4.pdf