[A long and somewhat technical comment on the Bitcoin fork.]
The cryptocurrency community is apparently able to shake off a continuous stream of arrests, and naturally sees China’s economic woes as Exhibit A for why you want to use Bitcoin. Sadly, Bitcoin exchange rates are crashing, which suggests that Bitcoin is not considered a safe asset by “the markets”.
But the most interesting story since the Nakamoto Document is the massive code fork, which is more like a religious schism than a political coup. As I remarked last week, such a software fork is almost always a very bad thing for developers and businesses, making engineering more complex and error-prone, and making planning difficult. If you are building your product today, you have to guess which software will still be in use next year.
The technical issues appear to be fundamental and basically intractable. (Grace Caffyn has a useful summary at CoinDesk.) At its heart, Bitcoin seeks to be a real-time, transaction-based system, where “real time” means the latency is “reasonable” for humans. If you try to buy something with Bitcoin, waiting a few seconds “feels right”, but waiting an hour is too long.
Second, Bitcoin aspires to support a global, anyone-to-anyone transaction system (in real time). There are potentially a lot of anyones on the planet, so this is potentially a lot of transactions. As an exercise, suppose that everyone on the planet used Bitcoin once per day. That’s an average of roughly 80,000 transactions per second. This is a tall order, and, of course, the number of transactions in the world economy is much, much larger than that.
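A quick back-of-envelope check of that figure (assuming a 2015-era world population of roughly 7 billion):

```python
# Back-of-envelope check of the "one transaction per person per day" figure.
# Assumes a world population of roughly 7 billion (a 2015-era estimate).
population = 7_000_000_000
seconds_per_day = 24 * 60 * 60  # 86,400

tx_per_second = population / seconds_per_day
print(f"{tx_per_second:,.0f} transactions per second")  # roughly 81,000
```

So “80,000 per second” is about right as an order of magnitude, and real-world payment volume is far higher still.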
Third, Bitcoin is supposed to be “decentralized”, and secured in part by the participation of millions of independent processors, including relatively small Mom and Pop operators. If Bitcoin processing becomes dominated by a relative handful of large operations, it is open to manipulation and generally no better than the conventional systems it seeks to displace. For this reason, the Bitcoin software and protocols are open, and have generally been designed not only to be fair to all players, large and small, but also to be feasible on at least modest-sized computers.
This last goal has taken a beating in the last few years, as a technical arms race has seen the growth of extremely large nodes and pools of nodes, which have overwhelmed the ability of the average Joe to really participate individually.
So, the Bitcoin software requirements amount to “infinitely fast, infinitely many transactions, executable on any computer and network”. Does anyone see a problem here?
Actually, this is a very common set of software requirements: everyone always wants infinite capability at zero cost.
In the specific case of the August Fork Event (which could be a good name for a band), the arguments center on the core distributed data structure, the Blockchain. This data structure is the core of Bitcoin in the sense that it is what everyone shares. So we have to agree on it, or the game is over.
Each transaction must be entered into a block on the Blockchain, and each block contains, you guessed it, a block of records. How big should a block be? How big should batches of transactions be? There is no one right answer; it depends on what you are trying to do and on the details of the networks and computers. “Big” is relative to the speed of the network and to your memory and processor, so it changes over time and may mean something different to you than to me.
Batching up transactions creates various engineering tradeoffs. Collecting a bunch of transactions in a batch is more efficient for the network (shipping one big chunk rather than a lot of little ones), and the network is usually the bottleneck. But collecting a batch means delay for some transactions, until the train is full. So you probably want to process a batch every so often (every few seconds), even if not completely full.
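The flush-when-full-or-stale policy described above can be sketched as follows. This is purely illustrative; the class name, batch size, and timeout are made up, and this is not actual Bitcoin node code:

```python
import time

class TransactionBatcher:
    """Collect transactions into a batch; flush when the batch is full
    or when the oldest transaction has waited too long, whichever
    comes first. (Illustrative sketch only, not Bitcoin node code.)"""

    def __init__(self, max_size=100, max_wait_seconds=5.0):
        self.max_size = max_size
        self.max_wait = max_wait_seconds
        self.batch = []
        self.oldest = None  # arrival time of the oldest queued transaction

    def add(self, tx):
        """Queue a transaction; return a full batch if it triggers a flush."""
        if not self.batch:
            self.oldest = time.monotonic()
        self.batch.append(tx)
        if len(self.batch) >= self.max_size:
            return self.flush()
        return None

    def poll(self):
        """Call periodically: flush a partial batch that has waited too long."""
        if self.batch and time.monotonic() - self.oldest >= self.max_wait:
            return self.flush()
        return None

    def flush(self):
        out, self.batch, self.oldest = self.batch, [], None
        return out
```

The timeout is what keeps a lightly loaded system responsive: the “train” leaves every few seconds whether or not it is full.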
The other issue is that if blocks are allowed to be really big, then they take a long time to pass around and a lot of memory to handle. This matters because Bitcoin “mining” is a race to process incoming (blocks of) transactions. Every participating node must be able to handle every block, and to process it reasonably quickly. Even setting aside the competitive nature of mining, transactions need to be confirmed pretty quickly, or Bitcoin cannot be used for payments.
The big problem that led to the fork flap is that the current protocol limits blocks to 1MB, which is small enough that most networks and computers can handle them. But the rub is that there are so many transactions happening that there soon will not be enough room in the blocks to enter all the records immediately. Transactions will get stuck in queues waiting for an empty subway car. This is really bad, and both theory and practice tell us that it will lead to a meltdown of the whole system, with infinite wait times. Game over.
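The “infinite wait times” claim is standard queueing theory: in the simplest single-server (M/M/1) model, the average wait is 1/(μ − λ), which blows up as the arrival rate λ approaches the service capacity μ. A sketch, using the commonly quoted rough figure of about 7 transactions per second for 1MB blocks (an assumption here, not a measured number):

```python
# Average wait in a simple M/M/1 queue: W = 1 / (mu - lambda).
# As the arrival rate approaches capacity, waiting time grows without bound.
def avg_wait(arrival_rate, service_rate):
    """Mean time in system, in seconds, for an M/M/1 queue."""
    if arrival_rate >= service_rate:
        return float("inf")  # unstable: the queue grows forever
    return 1.0 / (service_rate - arrival_rate)

capacity = 7.0  # tx/s: rough 1MB-block throughput often quoted for Bitcoin
for load in (3.5, 6.3, 6.93, 7.0):
    print(f"load {load:4.2f} tx/s -> avg wait {avg_wait(load, capacity):.1f} s")
```

At half load the wait is a fraction of a second; at 99% of capacity it is already many seconds, and at full capacity it diverges. That is the “meltdown”.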
Stepping back for a minute, let’s consider how conventional data systems handle these challenges. (Bitcoin certainly did not invent these problems!) First, most “centralized” systems are organized as federations of nodes, and you usually interact with some local pool of servers, not with any arbitrary server in the world. Behind the scenes, the federation cooperates to pass your transaction to other nodes as needed. All that part of the network is definitely not your Mom and Pop home computer or network.
Second, the systems use a lot of caching and buffering to reduce latency. There may well be really large blocks of data, which is feasible because the systems are specialized and expensive. They are also complex, and surely out of the reach of ordinary people or even most corporations.
In other words, the goal of democratic decentralized processing limits the use of many standard engineering tricks, because they require “centralization” and “trust”.
Back to the Bitcoin fork.
The fork is intended to deal with the problem of “running out of room” in the blockchain. This problem has been discussed for quite a while, but, as I have noted, there is no obvious solution. Every response has good and bad points, and, more important, different uses and goals are affected differently.
The BitcoinXT initiative / sect / coup takes the bull by the horns and simply increases the size of the blocks, to 8MB initially, increasing further in the future. This at least pushes the problem farther into the future, at a cost in latency and possibly by forcing smaller nodes out of business.
Resistance to this approach, aside from the religious and political objections, is about the effect on smaller nodes, and also on hoped-for transaction fees. When the blocks are congested and space is in high demand, there is an opportunity to charge fees to jump the queue, and also to develop alternatives.
The queue jumping seems like a bad idea long term. At the extreme, poorer users get priced out and face long delays or even failed transactions. It’s hard to justify a Bitcoin that only works for the wealthy; that’s no better, and generally worse, than conventional finance. (One of the useful effects of “centralized” systems is that they can enforce non-market notions of “fairness”.)
The alternatives under consideration involve the fundamentally sensible idea that a single, global blockchain can’t possibly support all the transactions in the world. So, there should be additional layers of record keeping, localized or specialized to be efficient, which use the blockchain as their backbone. An analogy would be how a merchant might call the local bank to verify a local patron’s check. Past that local confirmation, the check then follows a slower process to be cleared and all balances adjusted.
There are many ways this might be done, and they all add complexity. They also make the “trust” equation more complex. Even if you “trust” the main Bitcoin blockchain, who knows if you should trust records from some sidechain? That depends on the sidechain. And if there can be an infinite number of such sidechains, how do we sort them out?
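One common way to make a sidechain’s records checkable against the main chain is “anchoring”: periodically committing a hash of the sidechain’s ledger into the main blockchain, so anyone who trusts the main chain can later verify the sidechain’s history. A toy sketch; the function names, the JSON encoding, and the in-memory “main chain” list are illustrative assumptions, not a real protocol:

```python
import hashlib
import json

def ledger_digest(transactions):
    """Deterministic digest of a batch of sidechain transactions."""
    payload = json.dumps(transactions, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

main_chain_anchors = []  # stand-in for records written to the main blockchain

def anchor(transactions):
    """Commit the digest of a sidechain batch to the (simulated) main chain."""
    digest = ledger_digest(transactions)
    main_chain_anchors.append(digest)
    return digest

def verify(transactions, digest):
    """Anyone holding the raw sidechain records can re-derive the digest
    and check it against the anchor on the main chain."""
    return ledger_digest(transactions) == digest

batch = [{"from": "alice", "to": "bob", "amount": 5}]
d = anchor(batch)
print(verify(batch, d))                                          # True
print(verify([{"from": "mallory", "to": "bob", "amount": 500}], d))  # False
```

The anchor proves the sidechain’s history hasn’t been rewritten after the fact, but it says nothing about whether the sidechain’s rules are any good; that is exactly the trust question raised above.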
Another alternative is to replace the one-size-fits-all hard limit with a mechanism that enables miners (the ones most affected by this aspect of the protocol) to ‘negotiate’ the limit. This would allow the limit to increase gradually in the future, under the control of the operating nodes (rather than developers or investors).
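A minimal sketch of what such a negotiated limit might look like, in the spirit of BIP 100. The median-vote rule and the factor-of-two clamp here are assumptions for illustration, not the actual proposal’s parameters:

```python
# Hypothetical "negotiated limit" mechanism: each miner votes for a
# preferred block size, and the effective limit moves to the median vote,
# clamped so it can only change gradually per adjustment period.
def negotiated_limit(current_limit, votes, max_step=2.0):
    """Return the new block-size limit (in MB) given miners' votes (in MB).
    The change per adjustment period is capped at a factor of max_step."""
    votes = sorted(votes)
    median = votes[len(votes) // 2]
    floor = current_limit / max_step
    ceiling = current_limit * max_step
    return max(floor, min(median, ceiling))

# Most miners want 8 MB, but the limit can at most double per period:
print(negotiated_limit(1.0, [1, 8, 8, 8, 20]))  # -> 2.0
```

Note who holds the power here: the median is set by the miners, which is precisely why this approach amounts to handing them the throttle.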
This approach buys time, acknowledges “engineering reality”, and basically cedes control of Bitcoin to the miners. It also surely won’t work very well; it will basically amount to a gradual increase in the block size at a pace to suit a tiny segment of the Bitcoin community.
After all these words, what do you think should be done, Bob? Take a stand!
With the caveat that I have only a general understanding of the technical details, let me say this.
The BitcoinXT approach seems technologically harmless (8MB vs. 1MB is not that big a deal). It also does not really solve the problem, except in the short run.
The idea of “incentivizing” processors by raising fees is a non-starter. It will never work, at least not the way intended. Why? We already have plenty of for-fee transaction systems that work as well as or better than Bitcoin for most uses.
The sidechain ideas are definitely the right way to go, including ideas like Ripple, which has its own blockchain, but could easily be anchored to the Bitcoin blockchain if you wanted to. However, these solutions are not strictly “peer-to-peer”, “trustless” systems. They can be engineered in any way you want, which is a good thing in the long run.
In fact, these aren’t mutually exclusive solutions, technologically. Raising the size of the block buys time to implement other, off-chain approaches that probably will be needed in the end.
But I have to say that I think the fork was a terrible idea, technologically and sociologically. Taking my football and starting a new league is bad politics and a good way to break the technology. And it’s really, really not the way to run a “community”.
In that sense, the ‘BIP 100’ proposal, which is technically similar, is much better community relations, even though it actually makes the miners the unelected ‘central bank’ of Bitcoin. That will be bad in the future, if Bitcoin lasts that long.