
Bitcoins Aren’t Dead – They’re Actually Growing Faster Than We Can Handle

There’s tension in the bitcoin community – it’s never been this bad. There’s drama. Good developers like Mike Hearn are leaving, though writing the cryptocurrency off on his way out, as Hearn did, was not the best thing to do. But let’s discuss why this is happening at all.

First, the basics. This short video by Mashable from 2014 should have you covered.

So, all the actors involved in the current crisis want bitcoin to succeed. More importantly, they want it to scale – they want the cryptocurrency’s capacity to grow. When bitcoin was proposed in 2008, scale wasn’t on anyone’s mind because it was a fascinating new experiment and everyone knew there’d be a lot of kinks to iron out. When Satoshi Nakamoto, the mysterious creator of the experiment, wrote the code that implements bitcoin, he imposed an arbitrary cap of 1 MB on a block. A block is the basic unit of the blockchain, and its size determines how much transaction data can be confirmed in each round of verification. The block size stayed that way because the people participating in the experiment didn’t expect it to become a problem. Plus, you could do a lot with 1 MB.

However, come 2015, there were so many transactions that blocks were filling up quickly and transactions were spilling over. Every block can contain 1 MB of data, and a new block is created on average every 10 minutes. This means that, every 10 minutes, the bitcoin blockchain can accommodate about 1 MB of new transaction data. So as the number of transactions grows beyond what a block can hold, a queue starts to form. For example, transaction A could be included in a block but not transaction B, because the block was full; B is kept waiting until the next block becomes available, about 10 minutes later. In the meantime, more transactions join the queue.
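
To get a feel for the numbers, here is a rough back-of-the-envelope calculation. The average transaction size used below (~500 bytes) is an illustrative assumption, not a figure from the article; real transactions vary with the number of inputs and outputs.

```python
# Rough capacity estimate for 1 MB blocks mined every ~10 minutes.
# The average transaction size (~500 bytes) is an assumption for
# illustration; real sizes vary with the number of inputs/outputs.

BLOCK_SIZE_BYTES = 1_000_000      # 1 MB block limit
BLOCK_INTERVAL_SECONDS = 600      # one block every ~10 minutes on average
AVG_TX_SIZE_BYTES = 500           # assumed average transaction size

tx_per_block = BLOCK_SIZE_BYTES // AVG_TX_SIZE_BYTES
tx_per_second = tx_per_block / BLOCK_INTERVAL_SECONDS

print(f"~{tx_per_block} transactions per block")
print(f"~{tx_per_second:.1f} transactions per second")
# With these assumptions: ~2000 transactions per block, ~3.3 per second.
# Any sustained demand above that rate piles up as a queue of
# unconfirmed transactions waiting for space in a future block.
```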

Over time, the number of unconfirmed transactions only keeps increasing. If Alice wanted to send Bob a bitcoin from her wallet to his, the first confirmation was supposed to take about 10 minutes on average. But because of the building traffic jam, it now takes about 30-50 minutes. And if the backlog keeps growing from this point on, the bitcoin blockchain will defeat its own purpose.
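
The effect on confirmation times can be sketched with a toy queue model. The arrival rate and per-block capacity below are illustrative assumptions, not measured network figures.

```python
# Toy queue model: when transactions arrive faster than blocks can clear
# them, the backlog of unconfirmed transactions keeps growing, and with
# it the wait for a confirmation. All numbers are illustrative.

CAPACITY_PER_BLOCK = 2000     # transactions a full block can hold (assumed)
ARRIVALS_PER_BLOCK = 2500     # new transactions per ~10-minute interval (assumed)

backlog = 0
for block in range(1, 7):     # simulate one hour (6 blocks)
    backlog += ARRIVALS_PER_BLOCK
    backlog -= min(backlog, CAPACITY_PER_BLOCK)
    print(f"after block {block} (~{block * 10} min): "
          f"{backlog} transactions still waiting")

# The backlog grows by 500 transactions every block, so a newly broadcast
# transaction waits behind an ever-longer queue, and confirmations stretch
# from the nominal ~10 minutes to 30-50 minutes and beyond.
```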

In a way, it’s good news that so many people are performing bitcoin transactions – the number hit an all-time high in January 2016. The only problem is that there’s now a long queue. And the miners, who produce the blocks, have started prioritising transactions based on the fees attached to them: the higher the fee, the more the miners earn for including a transaction. Bitcoin started out as the cheapest way to move money, but if miners prioritise by fee, better-off transactors are going to win out.
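
Here is a minimal sketch of how fee-based prioritisation plays out when a miner fills a block: sort waiting transactions by fee per byte and pack greedily until the limit is hit. The transactions and the scaled-down block limit are made up for illustration.

```python
# Minimal sketch of fee-based block building: sort waiting transactions
# by fee per byte and greedily pack the block until it is full.
# The block limit is scaled down and the transactions are invented,
# purely for illustration.

BLOCK_SIZE_LIMIT = 1_000  # bytes (toy limit for the example)

# (transaction id, size in bytes, fee in satoshis)
mempool = [
    ("tx_a", 350, 14_000),   # 40 sat/byte
    ("tx_b", 300, 3_000),    # 10 sat/byte
    ("tx_c", 400, 20_000),   # 50 sat/byte
    ("tx_d", 300, 1_500),    #  5 sat/byte
]

# Highest fee rate first: transactions paying more per byte jump the queue.
mempool.sort(key=lambda tx: tx[2] / tx[1], reverse=True)

block, used = [], 0
for txid, size, fee in mempool:
    if used + size <= BLOCK_SIZE_LIMIT:
        block.append(txid)
        used += size

print("included:", block)   # ['tx_c', 'tx_a']
# The low-fee transactions (tx_b, tx_d) are left waiting for a later block.
```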

If the block size isn’t increased beyond 1 MB, the fees are going to climb steadily, and the cryptocurrency will become unusable. Everyone in the community understands this.

Now, where did the crisis originate?

The most obvious step would be to increase the block size beyond 1 MB – to 2, 4 or 8 MB, or whatever – but it’s not the easiest one to take. This is because the stakeholders most affected by such a change would be the miners, and the consequences for them are many.

The miners are responsible for creating new blocks, providing the computing power that verifies transactions, and broadcasting the results to other nodes (the computers run by other participants) on the network. In effect, they are responsible for the security of bitcoin, and they have a significant and legitimate concern. They point out that 1 MB is a lot of data – which is true – and that it has to be transmitted over the Internet, consuming bandwidth. If the block size is doubled, the bandwidth consumption doubles too, as does the rate at which new data is added to the blockchain. This is something the miners aren’t okay with, especially those in countries like China where bandwidth is expensive or restricted. They’re asking for a smarter solution, one that doesn’t require doubling the block size every few years – especially if the problem is one of scale and exponential growth is in the offing. Fixing it shouldn’t come at the price of exponentially growing bandwidth costs.
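
The miners’ bandwidth and storage worry is easy to quantify in rough terms. The calculation below assumes every block is full, which is a simplification, but it shows how the data each node must move and store scales with the block size.

```python
# Rough data-growth estimate, assuming every block is completely full.
# Each node must download, relay and store this data, so doubling the
# block size roughly doubles the bandwidth and storage burden.

BLOCKS_PER_YEAR = 6 * 24 * 365          # one block every ~10 minutes

for block_size_mb in (1, 2, 4, 8):
    growth_gb_per_year = block_size_mb * BLOCKS_PER_YEAR / 1000
    print(f"{block_size_mb} MB blocks -> ~{growth_gb_per_year:.0f} GB "
          f"of new blockchain data per year")

# 1 MB blocks already mean ~53 GB of new data a year; 8 MB blocks would
# push that past 400 GB, a real cost on metered or restricted connections.
```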

Many of the core developers agree that this is a problem but aren’t convinced that a smarter solution exists. In any case, they organised a series of conferences called ‘Scaling Bitcoin’ in 2015, and a lot of interesting presentations were made. At the one in Hong Kong on December 7, developer Pieter Wuille presented a technique called segregated witness. As Coindesk reported, one core developer endorsed it as capable of providing a “fourfold increase” in network capacity in a “short time-frame”.
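
The intuition behind segregated witness can be illustrated with a toy calculation: signatures (the ‘witness’ data) make up a large share of a typical transaction, so if they are moved out of the data counted against the 1 MB limit, more transactions fit per block. The 60% witness share below is an illustrative assumption, not the proposal’s actual accounting rules, and the “fourfold increase” quoted above depends on details of the real design.

```python
# Toy illustration of the segregated-witness idea: if signature ("witness")
# data no longer counts toward the 1 MB limit, each transaction occupies
# less of the block, so more transactions fit. The 60% witness share and
# 500-byte transaction are assumptions, not the actual SegWit accounting.

BLOCK_SIZE_LIMIT = 1_000_000   # bytes counted against the limit
AVG_TX_SIZE = 500              # assumed average transaction size, bytes
WITNESS_SHARE = 0.6            # assumed fraction of a transaction that is signatures

tx_per_block_before = BLOCK_SIZE_LIMIT // AVG_TX_SIZE
counted_size_after = AVG_TX_SIZE * (1 - WITNESS_SHARE)   # witnesses excluded
tx_per_block_after = int(BLOCK_SIZE_LIMIT // counted_size_after)

print(f"before: ~{tx_per_block_before} transactions per block")
print(f"after:  ~{tx_per_block_after} transactions per block")
# Under these toy assumptions capacity grows ~2.5x without changing the
# nominal 1 MB limit; nodes still transmit witness data, but it is
# accounted for differently.
```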

The technique at the heart of the ongoing crisis is the hard-fork. The bitcoin code – which implements the cryptocurrency, allows miners to produce new blocks, and dictates that there will only ever be 21 million units in circulation – is a unified, self-consistent entity that the community’s core developers are entrusted with safeguarding as well as upgrading. The code that runs today isn’t exactly the one Satoshi released in 2009. Users have been submitting changes to it, which are peer-reviewed and discussed before adoption. And to safeguard the integrity of bitcoin as well as the blockchain, it’s best if everyone on the network runs the same version of the code at any given time.

In a soft-fork, the code is updated (to fix some issue, tighten a rule, etc.) but not everyone on the network is forced to adopt the new version immediately. In other words, it’s the kind of update that, if you didn’t install it right away, wouldn’t exclude you from the network. With a hard-fork, the costs are steeper: it’s the sort of update that, when skipped, leaves your software rejecting blocks produced under the new rules. The worse consequence is that people continuing to run the ‘old code’ automatically start maintaining their own blockchain, separate from the one maintained by those running the ‘new code’. When such a fork happens, the history of transactions up to the point of forking is common to both, but from that point on there are two distinct versions – and it’ll be pandemonium before there’s consensus on which is the legitimate one.
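
The chain split caused by a block-size hard-fork can be illustrated with a toy validation rule: old nodes enforce the 1 MB limit while upgraded nodes accept up to 2 MB, so the first oversized block splits the network into two histories. All block sizes here are invented for illustration.

```python
# Toy illustration of why a block-size hard-fork splits the chain:
# nodes running the old rules reject any block larger than 1 MB, while
# upgraded nodes accept up to 2 MB. The first oversized block is added
# to one chain but not the other, and the histories diverge from there.

OLD_LIMIT = 1_000_000   # bytes, pre-fork rule
NEW_LIMIT = 2_000_000   # bytes, post-fork rule

# Block sizes produced by upgraded miners after the fork (illustrative).
incoming_blocks = [900_000, 950_000, 1_500_000, 800_000]

new_chain = []          # chain followed by upgraded nodes
old_chain = []          # chain followed by nodes still on the old rules
diverged = False

for size in incoming_blocks:
    new_chain.append(size)            # upgraded nodes accept every block here
    if diverged or size > OLD_LIMIT:
        # Old nodes reject the oversized block -- and every later block
        # built on top of it -- so their copy of history stops matching.
        diverged = True
    else:
        old_chain.append(size)

print("upgraded nodes see:", new_chain)   # [900000, 950000, 1500000, 800000]
print("old nodes see:     ", old_chain)   # [900000, 950000], then their own fork
```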

Some developers have started canvassing support for different versions of such hard-forks, each offering a different solution. Whichever one is implemented, miners would end up processing transactions under a different set of rules. Among them is a ‘classic’ fork that supports a plain-jane doubling of the block limit to 2 MB. And the tension in the community as you read this concerns whether or not this ‘classic’ fork should be implemented.

Personally, I’m for a soft-fork, not a hard one – the sort of fork, or update, that can be rolled out in phases, that can afford to wait for all miners to catch up over time, and that doesn’t disrupt current and future transactions because it introduces small, performance-enhancing changes to the blockchain and to the mining and transacting networks. (As it happens, Wuille’s presentation in Hong Kong was about how segregated witness could be deployed as a soft-fork.) Beyond this, there should also be a coherent plan that recognises that transactions will grow rapidly in number and size, and that spells out the procedures to be followed each time the core developers have to implement a community-wide solution without letting any stakeholders down.

Some people complain that the developers weren’t unanimous in their decision-making and that this has been the source of the problems. But the developers are not expected to be unanimous about anything. What would have helped is better communication about the measures they were taking to address the issue – even if they were unsure of the next steps – instead of letting problems simmer to the point where they’re adversely affecting public perception.

In sum: bitcoin is in a good place. We are hitting capacity limits, which means something needs to grow. The simplest solution is to increase the block size, but done as a hard-fork it could hurt miners by forcing them to pay more for bandwidth, and it wouldn’t be a permanent solution either. A soft-fork could work here – one in which developers optimise the implementation and increase the efficiency with which transactions are verified and broadcast without, in turn, demanding more bandwidth. Although the ‘classic’ hard-fork still enjoys a lot of support at the moment, especially from online exchanges wary of large transaction backlogs, it isn’t yet clear if it’ll be the way forward.

It’s also not the best time to invest in bitcoins – but this is not the end of bitcoin so much as a heartening sign of things to come. This is the first time such a problem has befallen the bitcoin community, and it will be interesting to see where it goes from here. It is also the first, and most illuminating, opportunity to observe what happens when the core developers can’t make a unanimous decision about the future of the cryptocurrency. In fact, this is history in the making – what happens now will become the stuff of allegory in the community’s future. My hope is that it will give rise to a new model for how we seek and build consensus in the development of a decentralised value-exchange system.

Vignesh Sundaresan is the CEO of BitAccess, a Canadian company that makes bitcoin ATMs.

Featured image: A grid of processors working on mining bitcoins. Credit: moia/Flickr, CC BY 2.0.