How Should Bitcoiners View Quantum Computing?

In early 2020, quantum computing came into the public spotlight as a potential threat to Bitcoin. Because Bitcoin relies on the SHA-256 cryptographic hash function for its proof-of-work network consensus, its value ultimately rests on computational power.

A technology that circumvents the traditional binary system of 0s and 1s could, in principle, upend cryptography as we know it. But is this risk exaggerated?

Could quantum computing one day turn Bitcoin into a worthless piece of code? Let’s start by understanding why Bitcoin relies on cryptography.

Bitcoin bits and hashes

When we say that an image is 1 MB, we are saying that it contains 1,048,576 bytes. Since each byte contains 8 bits, the image contains 8,388,608 bits. The bit (binary digit) is the smallest unit of information, either 0 or 1, and it builds the entire edifice of our digital age.

In the case of an image, the bits in that 1 MB file assign a color to each pixel, making it readable by the human eye. In the case of a cryptographic hash function such as SHA-256 (Secure Hash Algorithm 256-bit), developed by the NSA, the output is a fixed-length 256-bit (32-byte) hash produced from an input of arbitrary size.

The primary purpose of a hash function is to convert any string of letters or numbers into a fixed-length output. This scrambling makes it ideal for compact data fingerprints and digital signatures. And because hashing is a one-way street, a hash cannot feasibly be reversed back into its input.
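
As a quick illustration, here is a minimal Python sketch using the standard hashlib module; it shows that inputs of wildly different sizes always yield a 256-bit digest, and that a one-character change produces a completely different hash.

```python
# Minimal illustration: SHA-256 maps inputs of any size to a fixed
# 256-bit (32-byte) digest, and small input changes scramble the output.
import hashlib

for message in [b"Hello, Bitcoin", b"Hello, Bitcoin!", b"x" * 1_000_000]:
    digest = hashlib.sha256(message).hexdigest()
    # 64 hex characters * 4 bits each = 256 bits of output
    print(f"{len(message):>9} bytes in -> {len(digest) * 4} bits out: {digest}")
```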

Therefore, when we say that SHA-256 provides 256-bit security, we mean that there are 2^256 possible hashes an attacker would have to contend with to reverse it. When Bitcoin payments are made, each Bitcoin block gets its own unique hash generated by SHA-256. Every transaction within the block contributes to this hash through the Merkle root, alongside the timestamp, nonce value, and other header metadata.

A would-be blockchain attacker would have to recompute the hashes of the necessary data not just for the block containing the targeted transactions, but for every subsequent block built on top of it. Suffice it to say that a search space of 2^256 makes this an almost impractical computational endeavor, requiring an enormous expenditure of energy and time, both of which are very expensive.
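
The toy Python sketch below illustrates that structure. It is not Bitcoin's exact serialization (real blocks use a specific binary header layout), but it shows how transaction hashes roll up into a Merkle root, and how the root, timestamp and nonce together determine the block hash, so altering any transaction changes everything downstream.

```python
# Toy illustration of a Merkle root and block hash (not Bitcoin's exact
# header format). Changing any transaction changes the root and the hash.
import hashlib

def sha256d(data: bytes) -> bytes:
    """Double SHA-256, as Bitcoin applies to transactions and headers."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(tx_hashes: list) -> bytes:
    level = tx_hashes[:]
    while len(level) > 1:
        if len(level) % 2:  # odd count: duplicate the last hash
            level.append(level[-1])
        level = [sha256d(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

txs = [sha256d(f"tx-{i}".encode()) for i in range(5)]
root = merkle_root(txs)

# Simplified "header": previous block hash + Merkle root + timestamp + nonce.
header = b"\x00" * 32 + root + (1700000000).to_bytes(4, "little") + (42).to_bytes(4, "little")
print("merkle root:", root.hex())
print("block hash :", sha256d(header).hex())
```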

But is this no longer the case with quantum computing?

The new quantum model of computing

Moving away from bits that are strictly 0 or 1, quantum computing introduces qubits. By taking advantage of the property of superposition, these units of information can be not only 0 or 1, but both simultaneously. In other words, we move from deterministic computing to non-deterministic computing.

Because qubits can exist in entangled, superposed states until they are observed, the calculations become probabilistic. With more available states than just 0 or 1, a quantum computer has the potential for parallel computation: n qubits can hold 2^n states at the same time.

A classical binary computer would have to run a function separately for each of those 2^n possible states, whereas a quantum computer can evaluate them in superposition. In 1994, mathematician Peter Shor developed an algorithm with exactly this in mind.

Shor’s algorithm combines the quantum Fourier transform (QFT) and quantum phase estimation (QPE) to speed up the search for periodic patterns. In theory, it could break most widely used public-key encryption systems, not just Bitcoin’s.
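
The part Shor’s algorithm actually accelerates is period finding. The hedged Python sketch below factors a small number the “Shor way,” but replaces the quantum step with a classical brute-force loop; on a quantum computer, QFT and QPE would find the period efficiently even for the enormous numbers used in real cryptography.

```python
# Classical sketch of the number-theoretic core of Shor's algorithm.
# The quantum speed-up lies entirely in finding the period r of a^x mod n,
# which is done here with a brute-force loop for illustration only.
from math import gcd
import random

def find_period_classically(a: int, n: int) -> int:
    """Smallest r > 0 with a^r = 1 (mod n); exponential-time in general."""
    value, r = a % n, 1
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_style_factor(n: int):
    while True:
        a = random.randrange(2, n)
        d = gcd(a, n)
        if d > 1:                    # lucky guess already shares a factor
            return d, n // d
        r = find_period_classically(a, n)
        if r % 2:                    # need an even period
            continue
        x = pow(a, r // 2, n)
        if x == n - 1:               # trivial square root, try another a
            continue
        p = gcd(x - 1, n)
        if 1 < p < n:
            return p, n // p

print(shor_style_factor(3233))       # 3233 = 53 * 61
```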

However, there is one big problem. If quantum computing is probabilistic, how reliable is it?

Coherence stability in quantum computing

Saying that qubits are in superposition is like imagining a coin flip. While the coin is in the air, it can be thought of as holding both states, heads and tails. But once it lands, the state resolves into a single outcome.

Likewise, when qubits are measured, their state collapses into a classical one. The problem is that an algorithm like Shor’s needs many qubits to maintain their superposition, and to keep interacting with one another, for a long enough period of time. Otherwise, the useful calculation never actually completes.

In quantum computing, these are the problems of quantum decoherence and of quantum error correction (QEC). Moreover, they must be solved across many qubits at once to perform complex calculations.

According to the paper “Millisecond coherence in a superconducting qubit,” published in June 2023, the longest coherence time achieved for a qubit is 1.48 ms, with an average gate fidelity of 99.991%. The latter figure indicates the overall reliability of the quantum processing unit (QPU).
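
To see roughly what those figures imply, here is a back-of-the-envelope Python sketch. The coherence time and gate fidelity are the paper’s numbers; the 100 ns gate duration is an assumption made purely for illustration and is not from the source.

```python
# Rough arithmetic only: how much computation fits in one coherence window?
import math

coherence_time_s = 1.48e-3   # longest reported qubit coherence (from the paper)
gate_fidelity    = 0.99991   # average gate fidelity (from the paper)
gate_duration_s  = 100e-9    # ASSUMED gate time, purely illustrative

# Sequential gates that fit inside one coherence window.
gates_in_window = int(coherence_time_s / gate_duration_s)

# Circuit depth before the chance that every gate succeeded falls below 50%:
# fidelity^n = 0.5  ->  n = ln(0.5) / ln(fidelity)
depth_at_50_percent = int(math.log(0.5) / math.log(gate_fidelity))

print(f"gates per coherence window: ~{gates_in_window:,}")
print(f"circuit depth at 50% success: ~{depth_at_50_percent:,}")
```

Even under these generous assumptions, only on the order of ten thousand reliable sequential operations fit in a single coherence window, far short of what a full run of Shor’s algorithm against 256-bit keys would require, which is why error correction matters so much.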

Today, the most powerful widely publicized quantum computer appears to be IBM’s, dubbed IBM Quantum System Two. A modular system built for expansion, Quantum System Two should perform 5,000 gate operations on a single circuit using three Heron QPUs by the end of 2024. By the end of 2033, this should rise to 100 million operations.

The question is, will this be enough to realize Shor’s algorithm and break Bitcoin?

Surviving the quantum computing threat

Due to decoherence and fault-tolerance issues, quantum computers do not yet pose a significant risk to cryptography. It is unclear whether a fault-tolerant quantum system can be achieved at scale, given the extreme degree of environmental isolation required.

Sources of decoherence include electron and phonon scattering, photon emission, and even electron-electron interactions. Furthermore, the more qubits Shor’s algorithm requires, the greater the decoherence problem becomes.

However, although these may seem like intractable problems inherent to quantum computing, significant progress has been made in error-correction methods. Riverlane’s Deltaflow 2, for example, performs real-time QEC across up to 250 qubits. By 2026, this approach is expected to enable the first useful quantum application running one million real-time quantum operations (MegaQuOp).

To break Bitcoin’s elliptic curve encryption within a day, around 13 million qubits would be needed, according to an AVS Quantum Science paper published in January 2022. While that would threaten Bitcoin wallets, a far larger number of qubits, on the order of a billion, would be needed to carry out a 51% attack on the Bitcoin mainnet.

As for Grover’s algorithm, which is designed to use quantum computing to search unstructured data (such as the space of possible hashes), a research paper published in 2018 suggested that no quantum computer would be able to run it effectively against Bitcoin until at least 2028.

Image credit: Ledger Magazine
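
For context, Grover’s speed-up is quadratic rather than exponential. The simple Python sketch below only compares rough query counts, but it shows why even an ideal Grover machine would still face an astronomically large search against 256-bit hashes.

```python
# Rough comparison: classical brute-force search of an n-bit space needs on
# the order of 2^n attempts; Grover's algorithm needs roughly 2^(n/2) queries.
def grover_queries(bits: int) -> int:
    return 2 ** (bits // 2)

for n in (64, 128, 256):
    print(f"{n:>3}-bit search: classical ~2^{n}, "
          f"Grover ~2^{n // 2} ({grover_queries(n):.3e} queries)")
```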

Of course, the Bitcoin network’s hash rate has increased dramatically since then, and quantum computing still has to overcome decoherence as a major hurdle. But if QC roadmaps do eventually turn into reliable quantum systems, what can be done to counter the QC threat to Bitcoin?

Resistance to quantum computing

There are several proposals to protect Bitcoin holders from quantum computers. Since a 51% QC attack is very unlikely, the focus is mainly on hardening wallets. After all, if people could not rely on their Bitcoin holdings being secure, a mass exodus from Bitcoin would follow.

In turn, the Bitcoin price would fall and the network’s hash rate would drop dramatically, making it more vulnerable to a quantum attack than previously expected. One such hardening measure is the implementation of Lamport signatures.

With Lamport signatures, a private key is generated as 256 pairs of random 256-bit strings, 512 strings in total. The public key consists of the cryptographic hash of each of those 512 strings. Each BTC transaction would then need its own one-time Lamport signature.

Since Lamport signatures are based on hash functions rather than on the elliptic curves over finite fields used in the Elliptic Curve Digital Signature Algorithm (ECDSA), which Bitcoin uses and which Shor’s algorithm can exploit, they are a viable quantum-resistant alternative.
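
A minimal, hedged Python sketch of the idea follows; it is illustrative only and not the exact construction of any specific Bitcoin proposal. Note that the signature alone is 256 x 32 bytes = 8 KB, which is where the size concern discussed below comes from.

```python
# Minimal Lamport one-time signature over SHA-256 (illustrative, not a
# production scheme). Private key: 256 pairs of random 256-bit strings.
# Public key: the hash of each string. A signature reveals one secret per
# bit of the message digest, so each key pair must be used only once.
import hashlib
import os

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    sk = [[os.urandom(32), os.urandom(32)] for _ in range(256)]  # 512 secrets
    pk = [[H(pair[0]), H(pair[1])] for pair in sk]
    return sk, pk

def sign(message: bytes, sk):
    digest = int.from_bytes(H(message), "big")
    # For each bit of the digest, reveal the matching secret from that pair.
    return [sk[i][(digest >> (255 - i)) & 1] for i in range(256)]

def verify(message: bytes, signature, pk) -> bool:
    digest = int.from_bytes(H(message), "big")
    return all(H(signature[i]) == pk[i][(digest >> (255 - i)) & 1]
               for i in range(256))

sk, pk = keygen()
sig = sign(b"send 0.1 BTC", sk)           # one-time use only!
print(verify(b"send 0.1 BTC", sig, pk))   # True
print(verify(b"send 0.2 BTC", sig, pk))   # False
```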

The downside of Lamport signatures is their size, up to 16 KB, and their one-time use. Of course, simply rotating addresses and keeping bitcoins in cold storage, thereby avoiding exposure of the public key on-chain, can also blunt a quantum attack.

Another way to frustrate potential QC attacks is to implement lattice-based cryptography (LBC). In contrast to ECDSA, which works over finite fields, LBC relies on hard problems defined over discrete points in an n-dimensional lattice (grid) that extends infinitely in all directions. Because of this structure, no quantum algorithm capable of breaking LBC has yet been developed.
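
To give a flavor of the approach, here is a toy, deliberately insecure Python sketch of the learning-with-errors (LWE) idea that underlies many lattice-based schemes. It is not any standardized algorithm, and the tiny parameters are chosen purely so the example runs; real schemes get their security from far larger dimensions and carefully chosen noise.

```python
# Toy LWE-style bit encryption (insecure, illustration only). The secret is
# hidden inside noisy linear equations over a lattice; recovering it is the
# hard problem that, so far, has no efficient quantum algorithm.
import random

q, n, m = 97, 8, 20                      # toy modulus, dimension, sample count

def keygen():
    s = [random.randrange(q) for _ in range(n)]                 # secret vector
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [random.choice([-1, 0, 1]) for _ in range(m)]           # small noise
    b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]
    return s, (A, b)

def encrypt(bit: int, pk):
    A, b = pk
    r = [random.randint(0, 1) for _ in range(m)]                # random subset
    u = [sum(r[i] * A[i][j] for i in range(m)) % q for j in range(n)]
    v = (sum(r[i] * b[i] for i in range(m)) + bit * (q // 2)) % q
    return u, v

def decrypt(ciphertext, s) -> int:
    u, v = ciphertext
    d = (v - sum(u[j] * s[j] for j in range(n))) % q
    return 1 if min(d, q - d) > q // 4 else 0                   # nearer 0 or q/2?

s, pk = keygen()
print([decrypt(encrypt(bit, pk), s) for bit in (0, 1, 1, 0)])   # [0, 1, 1, 0]
```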

However, to adopt a new type of cryptography, Bitcoin would have to undergo a hard fork. For that to happen, there would likely need to be multiple signals that major breakthroughs in quantum computing, especially in qubit counts and fault tolerance, are imminent.

Bottom line

It’s safe to say that the Bitcoin mainnet itself is not in danger from quantum computing, either in the near or the more distant future. However, if quantum computing were ever to compromise Bitcoin’s cryptography by rendering SHA-256 and ECDSA obsolete, it would severely damage trust in the cryptocurrency.

This trust is crucial, as shown by major companies such as Microsoft and PayPal adopting Bitcoin payments, drawn by savings of up to 80% compared to card transactions, no chargebacks, and full control of funds. With over 300 million holders globally, Bitcoin’s appeal as a secure asset and cost-effective payment option remains strong.

Ultimately, the value of Bitcoin is maintained by the capital and trust behind it. Its historical price fluctuations show how events, from Elon Musk’s tweets and PayPal integration to ETF launches and the FTX collapse, have affected market sentiment. A credible threat to Bitcoin’s cryptography could lead to panic selling, miner withdrawals, and decreased mining difficulty, which could open the door to a 51% QC attack with far fewer qubits.

To prevent such a scenario, Bitcoin holders and developers would do well to keep up with developments in quantum computing.

This is a guest post by Shane Nagle. The opinions expressed are entirely their own and do not necessarily reflect the opinions of BTC Inc or Bitcoin Magazine.
