The bitcoin threat to security
from TEST - January 2019
by 31 Media
Over the last decade testers have made good progress in understanding application security, but few have appreciated the changing use of asymmetric cryptography
Testers have new opportunities to work in crypto-currencies and blockchains if they adequately understand security. But lurking on the horizon is the risk that the unproven assumptions behind algorithmic and computational asymmetry could be devastatingly overturned. In that scenario, crypto-skilled testers will be in huge demand to rebuild our world without reference to current paradigms.
THE EMPIRE
From the beginning of humanity until the 1970s it was impossible to communicate secret messages without the risk of interception and decryption by eavesdroppers. The fundamental problem was the symmetric nature of encryption and decryption. Both the sender and receiver needed to use the same secret key, and somehow that key had to be sent from the sender to the receiver before any encoded messages could be transmitted. If an eavesdropper obtained sight of the key they could then view and change the messages at will. By WW2 this had become a huge problem for armies, navies, and air forces communicating via thousands of wireless radios.
In October 1944 Walter Koenig at Bell Laboratories wrote the Final Report on Project C-43. He suggested recipients of telephone messages could add interference to prevent eavesdropping, then subtract the noise from a recording to hear the message in the clear. Although this was technically impractical at the time, it theoretically removed the need for a sender to pass a secret key to the recipient. Twenty-nine years later Clifford Cocks read the report during his sixth week of work for GCHQ in Cheltenham. As a brilliant Cambridge mathematician, he immediately realised this could be applied to cryptography.
Multiplying two large primes is easy, even if they are more than a hundred digits long. Factoring the much larger product (i.e. working backwards from a semiprime to recover the primes) is very hard. Using the two prime numbers as a private key held only by the recipient, a public key (the semiprime product) could be passed to the sender to encrypt messages with an unbreakable cypher. If the public key was seen by an eavesdropper they would still be unable to decrypt the messages without the private key (the prime numbers held by the recipient and never transmitted). It took the freshly recruited spy about thirty minutes to come up with his prime number solution. The young mathematician’s discovery seemed immediately applicable to military communications, and it would become one of GCHQ’s most prized secrets.
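To see the asymmetry concretely, here is a minimal Python sketch (the primes are toy-scale and chosen purely for illustration; this is not a real cryptosystem). Multiplying the private primes is a single instant operation, while recovering them from the public product requires an exhaustive search whose cost grows explosively with the number of digits.

```python
# Toy illustration of the trapdoor: multiplying primes is instant,
# factoring the product back into primes is exhaustive search.
import math
import time

p, q = 1000003, 1000033   # tiny primes; real keys use primes hundreds of digits long
n = p * q                 # the public semiprime, computed instantly

def factor_by_trial_division(semiprime: int) -> tuple:
    """Recover the private primes the hard way: try every odd candidate."""
    for candidate in range(3, math.isqrt(semiprime) + 1, 2):
        if semiprime % candidate == 0:
            return candidate, semiprime // candidate
    raise ValueError("no factors found")

start = time.perf_counter()
print(factor_by_trial_division(n))            # (1000003, 1000033)
print(f"{time.perf_counter() - start:.3f}s")  # manageable at 13 digits, but the
                                              # search cost multiplies by ~10 for
                                              # every 2 digits added to the product
```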
Asymmetric (public key) cryptography had begun, but remained in the shadows of UK and US spy agencies. Then the discovery was repeated in 1976 by independent researchers. New Directions in Cryptography was published by Whitfield Diffie and Martin Hellman at Stanford, with Ralph Merkle at UC Berkeley. Their ideas were developed into a working solution using randomly chosen prime numbers over one hundred digits long in 1977 by Ron Rivest, Adi Shamir, and Leonard Adleman (RSA) from MIT.
The US National Security Agency (NSA) warned cryptographers that presenting and publishing their research could have legal consequences. The NSA also issued gag orders and tried to censor crypto research. And so the war began.
THE CYPHERPUNK REBELLION
Public-key cryptography provided unbreakable secret communications for the first time in history, but not for ordinary people. The relationship between citizens and state has always been highly unequal. A counter-culture movement began to emerge with strong interests in technology and promoting individual privacy. One member of the group, Judith Milhon, combined the popular term ‘cyberpunk’ with cypher (an algorithm for performing encryption or decryption) and created a new word ‘cypherpunk’ to describe her best friends.
Cypherpunk development is driven by philosophy rather than commerce, yet it rapidly produces revolutionary code and highly advanced hardware. We can see it works in practice, but there isn’t yet a theory to describe it. Several universities in the USA are currently researching the phenomenon of mining chip design, delivery, and processing speed.
"We examined the Bitcoin hardware movement, which led to the development of customized silicon ASICs without the support of any major company. The users self-organised and self-financed the hardware and software development, bore the risks and fiduciary issues, evaluated business plans, and braved the task of developing expensive chips on extremely low budgets. This is unheard of in modern times, where last-generation chip efforts are said to cost $100 million or more." - Michael Bedford Taylor, University of California
"The amazing thing about Bitcoin ASICs is that, as hard as they were to design, analysts who have looked at this have said this may be the fastest turnaround time - essentially in the history of integrated circuits - for specifying a problem, which is mining Bitcoins, and turning it around to have a working chip in people's hands." - Joseph Bonneau, Postdoctoral research associate, Princeton University.
Cypherpunks objected to governments and large organisations treating individuals’ data as their own property to collect, analyse and use however they liked. In 1991 the cypherpunk Phil Zimmermann released PGP (Pretty Good Privacy) to enable the general public to make and receive private email communications. In 1993 Zimmermann became the formal target of a criminal investigation by the US government for "munitions export without a license" because encryption was classified as a weapon.
On 9th March 1993, Eric Hughes published A Cypherpunk’s Manifesto. The closing statement provided the reason for cypherpunks to create bitcoin and blockchain: “The act of encryption, in fact, removes information from the public realm. Even laws against cryptography reach only so far as a nation's border and the arm of its violence. Cryptography will ineluctably spread over the whole globe, and with it the anonymous transactions systems that it makes possible”.
Before Bitcoin, 98 digital currencies were created and destroyed by attacks upon the central trust authority (e.g. imprisoning the owners and/or regulating their businesses out of existence), or by hackers ‘double spending’ the currency (copying the digital money file and spending it again while corrupting the central authority).

There was clearly a need for a better digital currency that prevented double-spending and removed the attack target presented by central control.
THE GENESIS BLOCK
The root of trust for the Bitcoin blockchain is the genesis block (the first Bitcoin block in the first blockchain). Every block header references the previous block header’s hash, connecting the new block to its predecessor in an unbreakable and immutable blockchain. Each header also summarises all of its block’s transactions in a Merkle tree of double-SHA256 hashes, and the chain of header references links every block back to the genesis block created on 3rd January 2009.
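The chaining principle can be sketched in a few lines of Python (the header layout here is illustrative text; a real Bitcoin header packs fixed-width binary fields for version, previous hash, Merkle root, timestamp, difficulty bits, and nonce).

```python
# Minimal sketch of block header chaining with Bitcoin's double SHA-256.
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Bitcoin's hash of choice: SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

genesis_header = b"version|no_previous_block|merkle_root_0|2009-01-03"
genesis_hash = double_sha256(genesis_header)

# Every later header embeds the previous header's hash, so altering any
# historical block changes every subsequent hash and breaks the chain.
block1_header = b"version|" + genesis_hash + b"|merkle_root_1|timestamp_1"
block1_hash = double_sha256(block1_header)
print(block1_hash.hex())
```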
Bitcoin messages in transit and the transaction log are not encrypted because the security model is reversed from the traditional central control of trust. All Bitcoin nodes are responsible for establishing trust linked back to the genesis block using a distributed peer-to-peer consensus network. Data is visible ‘in the clear’ to enable validation by all nodes in the network.
HOW BITCOIN WORKS
A cypherpunk using the pseudonym ‘Satoshi Nakamoto’ used cryptography to circumvent the mistakes that had previously led to digital currency failures. He avoided the legislative and security vulnerability of having centralised control by introducing a decentralised peer-to-peer network to verify transactions were valid and not ‘double spends’.
The consensus mechanism includes an ingenious adoption of a cryptographic hash process known generically as ‘proof-of-work’. All variations on proof-of-work contain a decision point: is the hash solution valid or invalid? The decision point is a test. This test is accurately executed at far higher speeds than any other computation in history (currently around 200 billion tests in the time a photon of light travels one metre, with exponentially increasing speeds every year).
To produce a valid Bitcoin block header hash and win the mining race, a miner needs to construct a candidate block filled with transactions, then use a Secure Hash Algorithm with 256-bit output (SHA-256) to calculate a hash of the block’s header that is smaller than the current difficulty target. In simplified terms, the more zeros required at the start of the hash, the harder it is to satisfy the difficulty target. It’s like trying to roll a pair of dice to achieve a score less than a target value. The probability of rolling less than twelve is 35/36 or 97.22%, while if the target is less than three, only one possibility out of every 36, or 2.78% will produce a winning result.
If the miner fails to produce a hash lower than the target they modify a variable header value known as a nonce (short for number used once), usually incrementing it by one, and retry. If they succeed, the miner includes the winning nonce in their block header metadata. All other nodes then use that nonce to verify quickly (in one hash operation) that the header does indeed produce a hash below the target.
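The mine-and-verify loop can be sketched in Python (a toy model: real miners hash an 80-byte binary header, and the real target is vastly smaller than the easy one assumed here).

```python
# Toy proof-of-work: find a nonce whose double-SHA256 header hash
# falls below the difficulty target; verification needs one hash.
import hashlib

def double_sha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header: bytes, target: int) -> int:
    """Hard direction: exhaustively try nonces until the hash is below target."""
    nonce = 0
    while True:
        digest = double_sha256(header + nonce.to_bytes(8, "big"))
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1          # number used once: increment and retry

def verify(header: bytes, nonce: int, target: int) -> bool:
    """Easy direction: a single hash checks the claimed solution."""
    digest = double_sha256(header + nonce.to_bytes(8, "big"))
    return int.from_bytes(digest, "big") < target

header = b"prev_hash|merkle_root|timestamp"
target = 2 ** 240           # easy target: roughly 1 hash in 65,536 wins
nonce = mine(header, target)
assert verify(header, nonce, target)   # thousands of tries to find, one to check
print(nonce)
```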
The crucial characteristic of proof-of-work is computational asymmetry. Solving a proof-of-work problem must be hard and time consuming, but checking the solution is always quick, like a game of Sudoku. When advances in computer processing power reduce the time taken to provide a solution, the difficulty (i.e. the number of operations) needed to calculate a solution is increased. The validity of the harder solution can still be quickly checked, usually in one operation, even if billions of extra operations are added to finding the solution. Imagine playing Sudoku where the number of rows and columns is increased every time you learn how to solve the puzzles faster.
Satoshi Nakamoto saw the potential to use proof-of-work in machine-to-machine (M-2-M) testing to prevent invalid financial transactions being recorded in a ledger without the oversight of a trusted third party. There are many fascinating variations on M-2-M tests, such as proof-of-stake and proof-of-useful-work, but so far these testing breakthroughs have excluded the efforts of professional testers due to the general lack of cryptography skills among testers.
In the case of Bitcoin, there is no central ledger. The Bitcoin ledger is distributed as a copy to every full node in the peer-to-peer network and each miner races to complete a proof-of-work solution. Satoshi Nakamoto’s most important invention is the decentralised mechanism for emergent consensus.
Thousands of independent nodes follow a common set of rules to reach majority agreement built upon four processes, continuously executed by machine-to-machine testing:
1. Independent verification of each transaction received by any node, using a comprehensive set of criteria shared by all nodes
2. Independent aggregation of transactions by mining nodes into candidate blocks, coupled with demonstrated computation through the proof-of-work algorithm
3. Independent verification of candidate blocks by every node and assembly into the chain of blocks. Invalid blocks are rejected as soon as any one of the validation criteria fails and are never included in the blockchain
4. The chain with the most cumulative computation demonstrated through proof-of-work is independently selected by every node in the peer-to-peer network. Once a node has validated a new block it attempts to assemble a chain by connecting the block to the existing blockchain. The ‘main chain’ is whichever valid chain of blocks has the most cumulative proof-of-work associated with it, as sketched in the code below.
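Step 4 can be expressed compactly (a simplified model rather than Bitcoin Core’s implementation: blocks are reduced to the one field the rule needs, and each block’s expected work is estimated from its difficulty target).

```python
# 'Heaviest chain' selection: nodes pick the valid chain whose blocks
# demonstrate the most cumulative proof-of-work, not the longest chain.
from dataclasses import dataclass

@dataclass
class Block:
    target: int   # the difficulty target this block's hash satisfied

def block_work(block: Block) -> int:
    # A lower target means more expected hash attempts, hence more work.
    return 2 ** 256 // (block.target + 1)

def select_main_chain(chains: list) -> list:
    """Return the candidate chain with the most cumulative proof-of-work."""
    return max(chains, key=lambda chain: sum(block_work(b) for b in chain))
```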
MINING SPEED
By any other industry standard, the growth in Bitcoin mining performance is extraordinary:
• 2009 - 0.5 MH/sec to 8 MH/sec (x 16 growth)
• 2010 - 8 MH/sec to 116 GH/sec (x 14,500 growth)
• 2011 - 116 GH/sec to 9 TH/sec (x 562 growth)
• 2012 - 9 TH/sec to 23 TH/sec (x 2.5 growth)
• 2013 - 23 TH/sec to 10 PH/sec (x 450 growth)
• 2014 - 10 PH/sec to 150 PH/sec in August (x 15 growth).
Looking at the current Bitcoin hashing rate we should expect the incredible advances shown above to appear as huge spikes in a graph. Amazingly, everything earlier than mid-2014 appears to be a flat line compared to the recent exponential growth.
At the time of writing, Bitcoin is now performing up to 61,866,256,000,000,000,000 tests per second, or put another way, almost 62 billion tests per nanosecond. The atomic limitations of Application Specific Integrated Circuit (ASIC) chip design are now being approached decades earlier than expected.
AND NOW THE RISK OF DISASTER
There is, however, a fundamental risk that could undermine proof-of-work and all variations used by blockchains. Proof-of-work is totally dependent upon the existence of computational asymmetry. In Bitcoin, the level of effort to solve a problem (calculate a valid hash and provide the nonce) must be high, but the level of effort to check the solution (use the nonce to check the hash is valid) must be low.
To picture how asymmetric the current proof-of-work difficulty target is, imagine that checking a winning solution is equivalent in effort to the physical volume of an amoeba, while calculating that solution by exhaustive search is equivalent to the volume of the whole planet Earth. This level of difficulty achieves a steady average rate of one new block added to the blockchain every ten minutes, while also preventing cheating.
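That ten-minute average is maintained by retargeting: every 2016 blocks, each node recalculates the difficulty target from how long the previous 2016 blocks actually took. A simplified Python sketch of the rule (omitting details such as the protocol’s maximum-target cap):

```python
# Difficulty retargeting: keep one block per ten minutes on average.
EXPECTED_SECONDS = 2016 * 10 * 60   # 2016 blocks at ten minutes each (two weeks)

def retarget(old_target: int, actual_seconds: int) -> int:
    # Bitcoin clamps the adjustment to a factor of four per retarget.
    actual_seconds = max(EXPECTED_SECONDS // 4,
                         min(actual_seconds, EXPECTED_SECONDS * 4))
    # Blocks arrived too fast => ratio below one => smaller target => harder.
    return old_target * actual_seconds // EXPECTED_SECONDS
```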
The risk begins with the assumption that proof-of-work is a nondeterministic polynomial (NP) time problem. NP problems have the characteristic of being hard to solve yet quick to check the result. Jigsaw puzzles are NP problems. The only way to be sure a pile of jigsaw pieces builds a complete picture is to try fitting every piece into place. At the end of the task it is instantly obvious if the jigsaw is complete.
We shall follow the general assumption that proof-of-work is an NP problem. Now comes the biggest assumption of all, one that is implicitly made by all blockchains: P ≠ NP.
P represents polynomial complexity problems, such as addition and multiplication, for which there exists a polynomial time algorithm that generates a solution, i.e. they can be solved ‘quickly’. NP represents nondeterministic polynomial complexity problems, such as Rubik’s cube and prime number factorisation, which consist of two phases: firstly, guess the solution in a non-deterministic way; secondly, verify or reject the guess using a deterministic algorithm that runs in polynomial time.
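The two phases can be illustrated with subset-sum, a classic NP problem (a hypothetical example, not drawn from the article): verifying a claimed solution is one cheap pass over the data, while finding a solution may mean trying exponentially many subsets.

```python
# Subset-sum: which numbers in a list add up to a target?
from itertools import combinations

def verify(numbers: list, subset: list, target: int) -> bool:
    """Phase two, in P: one cheap pass confirms or rejects a claimed answer."""
    return sum(subset) == target and all(n in numbers for n in subset)

def search(numbers: list, target: int):
    """The hard direction: up to 2**len(numbers) candidate subsets."""
    for size in range(1, len(numbers) + 1):
        for subset in combinations(numbers, size):
            if sum(subset) == target:
                return list(subset)
    return None

numbers = [267, 961, 1153, 1000, 1922, 493, 1598, 869, 1766, 1246]
print(search(numbers, 3111))                     # slow: exhaustive search
print(verify(numbers, [267, 1598, 1246], 3111))  # fast: True in one pass
```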
All P problems exist within the set NP, but no-one has been able to prove if P problems could be equal to NP, or definitely not equal to NP. The working assumption adopted by all blockchains is that P does not equal NP. While P vs NP is rarely discussed by testers, it is the greatest unsolved problem in computer science, and possibly all of mathematics.
The implications are so enormous that the Clay Mathematics Institute has offered a $1 million prize to anyone providing a proof that either P = NP or P ≠ NP. It is one of seven Millennium Prize Problems set on 24th May 2000.
Anyone able to solve proof-of-work in polynomial time can avoid the cost and effort of working through all possible solutions by arriving at the target in a single step.
NP problems are like looking for a needle in a haystack, which conventionally requires looking through the entire haystack until the needle is found. A P = NP solution doesn’t require faster searching, it requires an approach that doesn’t involve searching at all.
Metaphorically speaking, a solution would be like pulling a needle from a haystack using a super-powerful magnet.
A mathematician may discover a new solution at any time, yet an increasing risk is the advance of quantum computing. Once computing steps beneath the nanometre scale and inside the atom, the rules change. We will have entanglement, interference, superposition and decoherence to consider. Most importantly, answers will not be in a binary state. It may become possible to return many, perhaps all, possible answers simultaneously using Shor’s algorithm. This looks increasingly like a route to solving NP problems in polynomial time.
Experts don’t expect general purpose quantum computers to replace classical computers for more than a decade, and expect them to be highly expensive. But let’s not forget the mining chip design, delivery, and processing speed phenomenon. Miners are financially incentivised to accelerate the development of quantum computers targeted at solving proof-of-work, and the ultimate miner would solve proof-of-work in polynomial time to collect newly issued coins at high frequency. From there it is a short step to disaster. When a miner can submit valid blocks to nodes with the correct SHA-256 header hash as fast as the blocks can be tested (i.e. in polynomial time), emergent consensus is defeated.
THE END IS NIGH?
There are wider ramifications. If P = NP, every public key cryptosystem we have, such as RSA and Elliptic Curve Cryptography (ECC), and the secure protocols built on them, such as Secure Shell (SSH) and Transport Layer Security (TLS), become solvable in polynomial time.
This would mean the end of privacy and secrecy as we know it.
The quantum cryptography era would also be the beginning of a new frontier for testing. In the scramble to re-establish a secure means of transmitting and storing data, any tester with an understanding of cryptographic schemes resistant to quantum computing, such as quantum key distribution methods like the BB84 protocol, and mathematical schemes such as lattice-based cryptography, hash-based signatures, and code-based cryptography, would be worth their weight in gold.
There might still be enough time for smart testers to prepare.