One of the few actual innovations in Nakamotoan cryptocurrencies is the “consensus protocol”, a fully distributed algorithm for the classic Byzantine agreement problem. The challenge is to reach agreement, or at least consensus, when there are many messages in flight and it is impossible to know which messengers are honest and which are dishonest.
Emperor Nakamoto’s new clothes involve broadcasting proposed updates to all participating nodes. Each node checks the validity (using checksums), and accepts an update that agrees with its own history. Basically, all participating nodes “vote” on the results, and in the event of alternative proposals, the one with the most votes is taken as the consensus of the network.
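The vote-counting above can be sketched in a few lines of Python. This is my own toy illustration, not Bitcoin’s actual validation rules: the “checksum” is a plain hash, and “agrees with its own history” is modeled as a simple prefix check.

```python
import hashlib
from collections import Counter

def digest(update: str) -> str:
    # Stand-in "checksum": a hash of the proposed update.
    return hashlib.sha256(update.encode()).hexdigest()

def node_accepts(history: str, update: str, claimed: str) -> bool:
    # A node checks validity (the checksum) and that the update
    # agrees with -- here, extends -- its own history.
    return digest(update) == claimed and update.startswith(history)

def network_consensus(histories: list, proposals: list) -> str:
    # All participating nodes "vote"; the proposal with the most
    # votes is taken as the consensus of the network.
    votes = Counter()
    for history in histories:
        for update in proposals:
            if node_accepts(history, update, digest(update)):
                votes[update] += 1
    return votes.most_common(1)[0][0]

# Five nodes share history "abc"; a sixth lags behind at "ab".
histories = ["abc"] * 5 + ["ab"]
# Two competing proposals: "abcd" extends "abc", "abx" forks earlier.
winner = network_consensus(histories, ["abcd", "abx"])
# winner == "abcd" (6 votes to 1)
```

Note that even the lagging node votes for both proposals (each extends its short history) — which is exactly why the definition of “participating” matters, as discussed below.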
It should be clear that this simple protocol is also extremely conservative with a small ‘c’. Any record that is accepted by this consensus protocol is surely well supported. It will have been confirmed thousands of times over, eventually, by all nodes. (This is a fine semantic point, because the definition of “participating” is that you accept the consensus up to now, which is somewhat circular logic. Everyone who counts is in agreement because only those who agree really count.)
It should also be clear that Nakamoto’s approach does not scale well. The number of messages and decisions is linear with the size of the network, and the network is intended to be very large to make cheating difficult. (To dictate the result you need 51% of the votes—which is more difficult when there are a large number of votes.)
The upshot is that classic Nakamotoan consensus is very expensive and takes a long time, and becomes more expensive and slower as the network grows. In short, Bitcoin isn’t scalable, and probably isn’t sustainable.
(This result is no surprise to anyone who has studied computer science. As a matter of fact, you can learn a lot in college, if you stick with it and take it seriously.)
This summer researchers at the École polytechnique fédérale de Lausanne (EPFL) reported a suite of probabilistic algorithms that could replace Nakamotoan consensus. The basic idea is to consult a probabilistic sample of nodes, rather than all of them. Just as probability sampling can reliably get very near the result of a complete canvass, these algorithms make it possible to achieve confidence in a blockchain from only a fraction of the whole network on each vote.
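A toy simulation shows why sampling works. This is my own illustration of the sampling intuition, not the paper’s actual protocol: polling roughly √n random nodes out of a million almost always reproduces the majority of a full canvass.

```python
import math
import random

random.seed(42)  # fixed seed so the demo is repeatable

def sampled_majority(nodes, k):
    # Poll a random sample of k nodes instead of the whole network;
    # return True if the sampled majority says "yes" (1).
    return sum(random.sample(nodes, k)) > k / 2

# Hypothetical network: 1,000,000 nodes, 80% voting "yes" (1).
n = 1_000_000
nodes = [1] * 800_000 + [0] * 200_000

k = math.isqrt(n)  # sample ~sqrt(n) = 1000 nodes per query
# Repeat the sampled poll 100 times; with an 80/20 split, a
# 1000-node sample essentially never gets the majority wrong.
agreements = sum(sampled_majority(nodes, k) for _ in range(100))
```

With these numbers the sampled poll matches the full canvass in all 100 trials; the chance of even one miss is astronomically small (a many-sigma tail of the binomial distribution), which is the statistical guarantee the probabilistic protocols lean on.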
These Byzantine algorithms scale as the square root of the number of nodes, and use negligible computation and power resources. Clearly, you could make a better Bitcoin with these algorithms. It would be just as secure, just as decentralized, and way more sustainable, with way less latency. So, as Charles Q. Choi and others imply, this could be a “new alternative to Bitcoin”.
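A back-of-the-envelope comparison makes the square-root scaling concrete. This is my own arithmetic, using a naive all-to-all echo as the baseline for the classic approach:

```python
import math

def full_broadcast_msgs(n: int) -> int:
    # Naive all-to-all echo: every node sends to every other node.
    return n * (n - 1)

def sampled_msgs(n: int) -> int:
    # Sample-based broadcast: each node contacts ~sqrt(n) peers.
    return n * math.isqrt(n)

for n in (10_000, 1_000_000):
    ratio = full_broadcast_msgs(n) / sampled_msgs(n)
    print(f"n={n:>9,}: sampling uses ~{ratio:,.0f}x fewer messages")
```

The gap itself grows as √n: at ten thousand nodes sampling saves about a factor of a hundred in messages, and at a million nodes about a factor of a thousand — the bigger the network, the bigger the win.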
This technology joins many other variations on Nakamoto’s ideas, including permissioned blockchains, zero-knowledge blockchains, and zillions of alt-coins.
The question is, would this new thing be “Bitcoin”, or something else? It would do the same thing, just as the plethora of cryptocoins and blockchains do. But could you still call it “bitcoin”?
Some enthusiasts might well want a better engineered Bitcoin; we’ve seen many proposals for a “Bitcoin 2.0”. But experience has shown that a change this fundamental would not be supported by many Nakamotoans (e.g., this, this, this, this).
There are many reasons for this resistance, most of them non-technical.
First, Nakamoto (2009) is scripture: it is the very definition of what Bitcoin is. Whatever these Byzantine protocols are, they simply aren’t Nakamotoan. (Though Nakamoto’s protocol is probably a degenerate case of the general Byzantine Reliable Broadcast family.)
Second, the probabilistic protocols are complicated and require a certain level of “trust” in the mathematics and the laws of chance. Nakamoto’s simple, brute force approach is easy to understand and requires little math to believe in its correctness. For those concerned with “trust”, it may be difficult to lean on such relatively difficult math. (What if those sneaky Swiss guys are pulling a fast one, and there is a back door for “the man” to secretly control the results?)
Third, this protocol would surely scramble the mining economy, at least in the short run. I think it would come out with similar results for everyone, but a lot of current investments would probably be misplaced, and current business models upset. There is little chance that miners would agree to such a radical reworking of Bitcoin, even though it would probably benefit everyone in the long run.
And finally, there is an intangible value in the inefficiency of Bitcoin. For those who view Bitcoin as “virtual gold”, it is psychologically good for Bitcoin to be expensive and inconvenient, just like gold is expensive and inconvenient. For that matter, gold bugs are happy with poor scaling and long latency. This keeps Bitcoin “scarce” and therefore, in this mindset, “valuable”.
So this Swiss study joins many other schemes for how you might redo Bitcoin to get a better system. However, it is much more likely to become a competitor to Bitcoin than to be incorporated into the Nakamotoan Empire.
- Charles Q. Choi, New Alternative to Bitcoin Uses Negligible Energy, in IEEE Spectrum – Energywise. 2019. https://spectrum.ieee.org/energywise/computing/software/bitcoin-alternative
- Rachid Guerraoui, Petr Kuznetsov, Matteo Monti, Matej Pavlovic, and Dragos-Adrian Seredinschi, Scalable Byzantine Reliable Broadcast (Extended Version). arXiv:1908.01738, 2019. https://arxiv.org/abs/1908.01738
- Satoshi Nakamoto, Bitcoin: A Peer-to-Peer Electronic Cash System. 2009. http://bitcoin.org/bitcoin.pdf