The Central Fate of the Blockchain (In Case There is a Future at All)

/********
An essay of mine was recently published in the German issue of Technology Review (TR 10/2018), in which I examine the history of the internet in order to predict the fate of blockchain technology, especially regarding its promise of decentralization. This is a translated and extended version of the German text.
********/

"The internet interprets censorship as damage and routes around it." The sentence became the battle cry of early internet activists in the 1990s. It was coined by John Gilmore, co-founder of the digital civil rights organization Electronic Frontier Foundation (EFF), in a 1993 interview with Time Magazine.1 It summed up the internet's promise of technological freedom in a nutshell: "If you put the scissors to the internet in one place, the flow of information will simply bypass that place and still arrive unhindered at the recipient." This resistance to censorship has always been an essential part of the internet's promise of freedom, and it rests on the internet's "decentralization".

Looking back, one can argue about whether the internet ever delivered on this promise. Today, Google, Amazon and Facebook laugh at the dreams of a hierarchy-free internet. The internet has certainly shifted the balance of power here and there, but it has also concentrated and monopolized power in ways unimaginable at the time. And while some people in China still manage to smuggle unauthorized bits into the country, the government's censorship efforts behind the "Chinese firewall" can certainly be regarded as successful.

But the same promise of freedom through decentralization is now part of the blockchain discourse. Like the internet back then, the blockchain is now supposed to make censorship and state influence impossible. Like the internet back then, the blockchain today is supposed to dismantle hierarchies, strengthen the periphery, give a voice to the weak and set us all free: unregulated, wild and on a level playing field. With the blockchain, it should be possible to operate online services "serverless" – i.e. without a centralized computer infrastructure – because the data is distributed across the computers of all participants. This would make it possible – blockchain enthusiasts believe – to place the digital infrastructure and its control in the hands of the users. With the decentralizing power of the blockchain, the original promise of the internet would finally be within reach. But it is precisely the history of the internet that offers some objections to these myths.

The Birth of the Internet from the Hardware That Was Available

However, the decentralization of the internet did not originate from any idea of freedom in the first place; it resulted from plain necessity. Paul Baran, who is regarded as one of the first pioneers of today's internet, was a member of the RAND Corporation, a think tank close to the Pentagon. In his collection of essays "On Distributed Communications"2 from the early 1960s, he mentions two major reasons for building a decentralized computer network. The first was a military necessity: at the height of the Cold War, the Pentagon was interested in transferring data from the numerous radar stations for airspace surveillance (the SAGE project) quickly and securely to the command center. The network was supposed to keep functioning even when individual nodes were destroyed. The second reason was economic. At a time when computers were as rare as wonders of the world and almost as expensive, a decentralized computer network offered the opportunity to make better use of the extremely expensive hardware that already existed.

Baran's central idea for solving both problems was packet switching: the concept of splitting data into individual packets and sending them from station to station until they arrive at their destination. Instead of building one large supercomputer to control all connections, many less powerful computers could share the work of transmitting the data.
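To make the idea concrete, here is a minimal Python sketch of packet switching. The packet format and sizes are invented for the illustration; real protocols carry far more bookkeeping:

```python
# Minimal sketch of packet switching: split a message into numbered packets,
# let them arrive in arbitrary order (as if routed over different paths),
# and reassemble them at the destination using the sequence numbers.
import random

def split_into_packets(message: bytes, size: int = 8) -> list[tuple[int, bytes]]:
    """Split a message into (sequence_number, payload) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
    """Order packets by sequence number and concatenate their payloads."""
    return b"".join(payload for _, payload in sorted(packets))

message = b"Packets may take different routes to their destination."
packets = split_into_packets(message)
random.shuffle(packets)  # packets arrive out of order, via different nodes
assert reassemble(packets) == message
```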

When the ARPANET, the first precursor of the internet, was put into operation in 1969, the decentralized approach actually allowed the processing load and the costs of communication to be distributed among many computers and institutions, which in turn were distributed over the entire territory of the USA. In addition, the network was easy to expand.

Ten years later, when Robert Kahn and Vint Cerf laid the cornerstone of today's internet with the development of the TCP/IP protocol suite, they took over packet switching and also introduced the end-to-end paradigm. The intelligence of the transmission – splitting and sending the data packets as well as keeping track of incoming packets – was to lie solely at the ends of the transmission, i.e. with the sender and the receiver. All intermediate stations were to be "dumb" routers that just passed the data packets on to the next node without any overview of what was actually happening within the network. This design element, later romanticized as particularly liberating because it left control over communication to the users' computers, also had a concrete technical purpose. In the course of the 1970s, a zoo of different network protocols had evolved. Cerf and Kahn therefore developed a standard that could act as an intermediary between these protocols. As long as the ends spoke TCP/IP, the transmission could run over all kinds of network standards in between. This way, TCP/IP could connect heterogeneous networks with each other. The result was a network between networks: an INTER-NET.
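A toy illustration of the end-to-end idea, with invented node names and routing tables: the routers below look at nothing but the destination address and their local next-hop entry, while the payload stays opaque to them:

```python
# Hedged sketch of "dumb" forwarding: a router's only decision is which
# neighbour gets the packet next. Nodes and tables are made up.
NEXT_HOP = {          # router -> {destination: next hop}
    "A": {"D": "B"},
    "B": {"D": "C"},
    "C": {"D": "D"},
}

def forward(packet: dict, node: str) -> list[str]:
    """Pass a packet hop by hop until it reaches its destination."""
    path = [node]
    while node != packet["dst"]:
        node = NEXT_HOP[node][packet["dst"]]  # the router never reads the payload
        path.append(node)
    return path

print(forward({"dst": "D", "payload": b"opaque to every router"}, "A"))
# ['A', 'B', 'C', 'D'] -- all intelligence about the payload sits at A and D
```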

Free as in Free Scale

In 1982, just before the ARPANET was connected to the internet, it had about 100 nodes, and the number of connections per node already varied greatly. As more and more universities and institutions joined the network, individual nodes became heavily frequented hubs, and some data lines became early backbones. Over time, the network looked less and less like a fishing net, in which all nodes have more or less the same number of connections; instead, the distribution of connections (edges) per node approached a power-law distribution. Roughly speaking, the top 20% of the nodes had 80% of the connections, while the remaining 20% of the connections were distributed in a "long tail" across 80% of the nodes. The ARPANET came to resemble a road network, with highways, main roads and smaller secondary roads, or a tree branching out into ever finer branches. In network topology, one speaks of a "scale-free network": a network whose node-edge ratio looks the same at every zoom level, much like a Mandelbrot set.

Scale-freedom emerges very often in organically grown networks, because as a new participant it makes sense to connect to the largest node you can. Behind this lies a hidden economy: you need fewer "hops" to reach any other node. Clumps, it turns out, are shortcuts. It became apparent that even a distributed approach to data transmission brings its own centralization tendencies. Finally, in 1986, NSFNET (National Science Foundation Network)3, the first backbone – a kind of main artery – between individual research institutions in the USA, was built into the still young internet and formally introduced a hierarchy of data lines.
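The hidden economy of clumping can be re-enacted in a few lines. The following sketch grows a network by preferential attachment – every newcomer links to an existing node with probability proportional to that node's degree – and then checks how many connections the best-connected fifth of the nodes has accumulated. The numbers are toy values, not measurements of the real internet:

```python
# Toy model of "connect to the biggest node you can".
import random
from collections import Counter

def grow_network(n: int, seed: int = 42) -> Counter:
    """Return a Counter mapping each node to its number of connections."""
    random.seed(seed)
    edges = [(0, 1)]               # start with two connected nodes
    endpoints = [0, 1]             # each node appears once per connection it has
    for newcomer in range(2, n):
        target = random.choice(endpoints)   # degree-proportional choice
        edges.append((newcomer, target))
        endpoints += [newcomer, target]
    return Counter(node for edge in edges for node in edge)

degrees = grow_network(10_000)
top_fifth = sorted(degrees.values(), reverse=True)[: len(degrees) // 5]
print(sum(top_fifth) / sum(degrees.values()))
# well above 0.5 (around 0.6 in this toy run): a fifth of the nodes
# accumulates the majority of all connections, without anyone planning it
```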

Scale-free networks are both centralized and decentralized, because instead of one center there are many large and small centers. It helps to imagine decentralization as a spectrum. On one end is a network with a single central hub that regulates everything and is therefore incredibly efficient, because any two points can reach each other through just that one intermediary. On the other end of the spectrum is the mesh network, in which all nodes have the same number of connections, but a communication between two nodes may have to hop through hundreds of nodes in between. The scale-free network is thus a kind of compromise between decentralization and efficiency.
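A back-of-the-envelope comparison of the two ends of that spectrum, assuming a star (one hub) and a ring (a maximally regular stand-in for the mesh) with a thousand nodes each:

```python
# Average hop counts at both ends of the decentralization spectrum.
def star_hops(n: int) -> float:
    """Star: 1 hop to/from the hub, 2 hops between any two leaves."""
    pairs = n * (n - 1) / 2
    hub_pairs = n - 1                  # hub <-> leaf
    leaf_pairs = pairs - hub_pairs     # leaf <-> leaf, via the hub
    return (hub_pairs * 1 + leaf_pairs * 2) / pairs

def ring_hops(n: int) -> float:
    """Ring: take the shorter way around; n/4 hops on average."""
    total = sum(min(d, n - d) for d in range(1, n))
    return total / (n - 1)

print(star_hops(1000), ring_hops(1000))   # ~2.0 versus ~250 hops on average
```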

Such concentrations and clusters – around large internet providers such as AT&T and Telekom and international consortia such as Level 3 – exist on the internet too, and Google has already placed its private second internet next to the public one. Yet even today, hundreds of thousands of small and large internet providers worldwide still serve billions of people on the basis of the common protocol, thus keeping the promise of decentralization at least at this level.

However, the first reason Paul Baran cited for decentralization – stability against failures due to military or other attacks – holds only conditionally, precisely because of the internet's scale-freedom. That, at least, is the result of theoretical studies by network researchers such as Albert-László Barabási and others, published in Nature.4 According to the study, the network would remain stable even if a random 80 percent of its nodes failed. But if an "informed attacker" were to target the central nodes specifically, Barabási wrote, it would be relatively easy to switch off the entire internet. A prediction that has become considerably more explosive since the major DDoS attacks of 2016, which paralyzed Twitter, PayPal, Netflix and Spotify for several hours.5 Although the number of such attacks has fallen in the meantime, security experts are by no means giving the all-clear.
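The Nature result can be loosely re-enacted in code: grow a scale-free network (here each newcomer attaches to two existing nodes, for a bit of redundancy), then compare removing random nodes with removing the best-connected ones. This is only a sketch of the effect, not a reproduction of the paper's methodology:

```python
# Random failure versus "informed attacker" on a preferentially grown network.
import random
from collections import defaultdict

def grow(n: int, m: int = 2, seed: int = 1) -> dict:
    """Each new node attaches to m existing nodes, degree-proportionally."""
    random.seed(seed)
    adj, endpoints = defaultdict(set), []
    for a in range(m + 1):                 # small starting clique
        for b in range(a + 1, m + 1):
            adj[a].add(b); adj[b].add(a)
            endpoints += [a, b]
    for v in range(m + 1, n):
        targets = set()
        while len(targets) < m:
            targets.add(random.choice(endpoints))
        for t in targets:
            adj[v].add(t); adj[t].add(v)
            endpoints += [v, t]
    return adj

def largest_component(adj: dict, removed: set) -> int:
    """Size of the biggest connected cluster once `removed` nodes are gone."""
    seen, best = set(), 0
    for start in adj:
        if start in seen or start in removed:
            continue
        stack, size = [start], 0
        while stack:
            v = stack.pop()
            if v in seen or v in removed:
                continue
            seen.add(v); size += 1
            stack.extend(adj[v])
        best = max(best, size)
    return best

adj = grow(5000)
hubs = set(sorted(adj, key=lambda v: len(adj[v]), reverse=True)[:1000])
random_nodes = set(random.sample(list(adj), 1000))
print(largest_component(adj, random_nodes))  # still connects most survivors
print(largest_component(adj, hubs))          # shatters: the hubs held it together
```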

The Hidden Costs of Decentralization

So the internet has in fact become much more centralized – but not equally at all levels. While a largely decentralized approach still prevails on the lower layers, the most concerning concentration has taken place above them. To visualize this, imagine the internet as a stack in which protocols layer on top of each other. At the lowest level are the protocols providing WiFi, Ethernet or DSL – and, back in the day, the ARPANET, which has since been switched off. These are the protocols that TCP and IP were able to connect with each other by sitting on top of them as a general standard. On top of TCP/IP sits the so-called application layer, where our everyday internet usage actually happens. E-mail and the WWW, but also the apps on our smartphones, are part of this layer. And while decentralized approaches such as e-mail and the World Wide Web initially flourished on the application layer, it is precisely this layer that is dominated today by Google, Facebook, Amazon and other monopolistic, centralized platforms.
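The layer metaphor can be made literal with a toy encapsulation chain. All the header formats below are invented stand-ins, not the real protocol headers; the point is only that each layer wraps the layer above it and ignores what is inside:

```python
# Toy picture of the stack: every layer adds its own envelope.
def http_layer(body: str) -> str:
    return f"HTTP|{body}"            # application layer: the actual content

def tcp_layer(segment: str, port: int = 80) -> str:
    return f"TCP:{port}|{segment}"   # end-to-end transport

def ip_layer(packet: str, dst: str = "93.184.216.34") -> str:
    return f"IP:{dst}|{packet}"      # the common intermediary standard

def link_layer(frame: str) -> str:
    return f"ETH|{frame}"            # whatever carries it: Ethernet, WiFi, DSL...

print(link_layer(ip_layer(tcp_layer(http_layer("GET /")))))
# ETH|IP:93.184.216.34|TCP:80|HTTP|GET /
# Only the innermost part is the application; everything below just wraps it.
```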

This concentration in the application layer is inevitable, because innovation can hardly take place on the layers underneath. Decentrally implemented protocols such as TCP/IP have the major disadvantage of being highly resistant to any form of further development, due to their "path dependency": once a path has been taken, it can no longer be changed significantly, and any further development must build on the previous design decisions. You can see this effect in the transition of internet addresses from IP version 4 to IP version 6, which has been underway for 20 years now and still isn't finished. Once a distributed approach has been poured into a protocol, it develops an unruly structural conservatism. You can't just flip a switch somewhere to update the system; billions of people have to flip billions of switches. And in case of doubt they ask: why should we? Why change a running system? As a result, actual innovation has been pushed upwards. Instead of equipping the network protocol with new features, services were developed on top of it. That was certainly the idea, but it opened up a whole new space that, although built on decentralized structures, made new centralized services possible – and, in a sense, inevitable.

But why is the application layer dominated by large corporations today, when in the 1990s decentralized approaches like the WWW, e-mail and other protocols were initially predominant?

One answer is provided by "economies of scale". In industrial society, this meant that enormous cost reductions occurred when a product was manufactured 100,000 times instead of 1,000 times. The same applies to the digital world. Amazon needs considerably fewer employees and consumes less power to operate one data center with 100,000 servers than 100 hosting providers need to keep 1,000 servers each running. Add to this the fact that server-based services such as Facebook and Google can roll out innovations easily, while protocol-based platforms are stuck in their current state due to path dependency, and the dominance of centralized services becomes virtually self-evident.

Related to the scale effect is the network effect – a scale effect on the demand side – known in the networking world since the 1990s as Metcalfe's Law.6 Robert Metcalfe, one of the engineers of the Ethernet standard, formulated that the value of a network increases proportionally to the square of its number of participants. The more participants a network has, the greater the benefit for each individual. A growing network thus becomes ever more attractive and, through this positive feedback loop, develops a pull on potential new participants. In the end, everyone is on Facebook because everyone is on Facebook. But likewise, everyone has e-mail because everyone has e-mail, and Facebook and e-mail are based on TCP/IP because almost all internet services are based on TCP/IP. In other words: network effects work for decentralized platforms just as they do for centralized ones.
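Metcalfe's rule of thumb is easy to state in code: with n participants there are n(n-1)/2 possible connections, so tenfold growth in users means roughly hundredfold growth in "value":

```python
# Metcalfe's Law as a counting argument over possible pairwise connections.
def metcalfe_value(n: int) -> int:
    return n * (n - 1) // 2   # number of possible pairwise connections

for n in (10, 100, 1000):
    print(n, metcalfe_value(n))
# 10 -> 45, 100 -> 4950, 1000 -> 499500: ten times the users,
# roughly a hundred times the value -- the pull effect in numbers.
```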

However, this effect works against many decentralized platforms in a roundabout way. In the early 2000s, Google showed how a central search index could render a decentralized network – the millions of websites on the WWW – actually useful. This was shortly before Facebook showed that one could do without decentralized elements altogether by simply letting users create the content directly on the platform. Both Google and Facebook demonstrate that central data storage has a special advantage: it can be searched. And it is searchability that often makes network effects really come into their own. What good is it that your friends communicate on the same standard as you if you can't find them?
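A miniature version of that advantage, with invented URLs: a central crawler folds arbitrarily scattered content into one index that answers any query with a single lookup, no matter how many independent sites the content lives on:

```python
# Sketch of the centralization advantage: one index over scattered content.
from collections import defaultdict

index: dict[str, set[str]] = defaultdict(set)

def crawl(url: str, text: str) -> None:
    """A central crawler folds decentralized pages into one searchable index."""
    for word in text.lower().split():
        index[word].add(url)

crawl("site-a.example", "decentralized cat pictures")
crawl("site-b.example", "cat memes")
print(index["cat"])   # both sites, found with a single lookup
```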

While the internet protocol works fine without central searchability – a router only has to know its routing table to find the next router – non-searchability, combined with the existence of a disproportionately large competitor, is the main obstacle to the growth of alternative, decentralized social networks. That is why Diaspora, Status.net, Mastodon and all the other alternatives to Facebook and Twitter never really took off.
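For contrast, here is roughly what a router's "local knowledge" looks like: a longest-prefix match over its own routing table, no global index required. The prefixes and gateway names are made up for the illustration:

```python
# A router needs no search index: the most specific matching prefix wins.
import ipaddress

ROUTES = {
    ipaddress.ip_network("10.0.0.0/8"): "gateway-a",
    ipaddress.ip_network("10.1.0.0/16"): "gateway-b",
    ipaddress.ip_network("0.0.0.0/0"): "default-upstream",
}

def next_hop(dst: str) -> str:
    addr = ipaddress.ip_address(dst)
    matches = [net for net in ROUTES if addr in net]
    return ROUTES[max(matches, key=lambda net: net.prefixlen)]

print(next_hop("10.1.2.3"))   # gateway-b (the /16 beats the /8)
print(next_hop("10.2.0.1"))   # gateway-a
print(next_hop("8.8.8.8"))    # default-upstream
```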

The lack of searchability is indeed one of the problems that blockchain technology has addressed with some success. Because every participant can view and search all interactions in the network, network effects can unfold unhindered despite the absence of central data storage.

But this generates costs elsewhere. Not only does every process now require millions of parallel data stores instead of one, but these millions of copies also have to be reconciled to a common state again and again. This reconciliation is essential, because otherwise every participant could spend his or her Bitcoin or Ether several times – the so-called "double spending" problem. Solving this problem for the Bitcoin network alone devours roughly the annual energy budget of Austria.7 And even though less energy-hungry consensus procedures are already being applied to other cryptocurrencies, any solution, no matter how sophisticated, will always be thousands of times more complex than a simple write to a conventional, central database.
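How trivial the problem is with one central ledger – and therefore how much work the decentralized variant buys itself – can be seen in this minimal sketch of double-spend rejection:

```python
# With ONE central ledger, rejecting a double spend is a single set lookup.
spent_coins: set[str] = set()

def apply_transaction(ledger: list[dict], tx: dict) -> bool:
    """Append tx to the ledger unless its coin has already been spent."""
    if tx["coin"] in spent_coins:
        return False                  # double spend: reject
    spent_coins.add(tx["coin"])
    ledger.append(tx)
    return True

ledger: list[dict] = []
print(apply_transaction(ledger, {"coin": "c1", "from": "alice", "to": "bob"}))    # True
print(apply_transaction(ledger, {"coin": "c1", "from": "alice", "to": "carol"}))  # False
# With millions of ledger copies, every copy must reach this same verdict --
# that reconciliation is what proof-of-work pays for.
```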

Meanwhile, the scale effects of clumping undermine the blockchain promise. Bitcoin Gold – a Bitcoin variant – has already experienced a 51% attack.8 This is an attack in which an attacker brings 51% of the computing power of the network under his or her control in order to write to the blockchain on their own authority – for example, stealing money by double spending. Back when Bitcoin started, this was a purely theoretical possibility. Today, with mining professionalized and computing power concentrated among a few players, it has become a real possibility that some miners could join forces, or rent additional computing capacity, to carry out such an attack.
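Why the 51% threshold matters can be seen in a toy chain race: blocks are found in proportion to computing power, and the longest chain wins. The shares and block counts below are arbitrary:

```python
# Toy model of a hash-power race between an attacker and the honest network.
import random

def chain_race(attacker_share: float, blocks: int = 10_000, seed: int = 7) -> int:
    """Simulate who finds each block; return the attacker's lead in blocks."""
    random.seed(seed)
    lead = 0
    for _ in range(blocks):
        lead += 1 if random.random() < attacker_share else -1
    return lead

print(chain_race(0.45))  # clearly negative: a minority attacker falls behind
print(chain_race(0.51))  # tends positive: the majority can outpace the honest
                         # chain and rewrite the "longest chain" at will
```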

The structural conservatism of path dependency also makes blockchains difficult to develop further. A recent attempt to change Bitcoin by increasing the block size from currently 1 megabyte to 1.8 megabytes failed.9 It would have dramatically increased the speed of transactions, whose confirmation times had at times stretched to several days. But for a hard fork you need the majority of the community (at least 51% of the computing power) on board, and the established players often have their own agenda: protecting their holdings. Just as in analog capitalism, the established forces profit from the status quo and therefore oppose change.

For Bitcoin, Ethereum and many other cryptocurrencies, external services that enrich the protocols with extra functionality are already in development. Wallet services, for example, have taken to storing the huge blockchain data on central servers. The coin exchanges, where you can buy and trade Bitcoin, Ether and co., are popular and therefore central points of attack – for hackers as well as for law enforcement. Ethereum applications (dApps) are distributed by design but are often operated through centralized websites. In other words, what has happened to all decentralized approaches is already happening here: new services move to higher layers and establish new centralities there.

The Historical Dialectic of Decentralization

It is far from obvious whether or when blockchain-based technologies will really have the disruptive impact on our centralized internet that they are said to have.10 Currently, most blockchains are still solutions looking for a problem. Their only unique selling point – decentralization – carries a torrent of hidden costs, which have already proved prohibitive for similar approaches in the past.

However, important insights can be drawn from the history of the internet's successful and less successful decentralized architectures. Decentralized approaches seem to work when the infrastructure is geographically distributed anyway, as is the case with regional and national internet service providers. They work when the decentralized infrastructure doesn't need significant further development, because innovation can move to higher layers. They flourish when you manage to make them searchable, as Google did for the WWW and The Pirate Bay did for BitTorrent. They work when you can drastically reduce the extra costs of decentralization, or justify them with a huge extra benefit, as in the early days of the internet. It also helps immensely if what you build in a decentralized manner does not already exist as a centralized service. That is the only explanation for how a standard as inadequate as e-mail could prevail – and last so long.

So let’s imagine that enough of these criteria have been met for a new, decentralized, protocol-based infrastructure based on blockchains to raise its head. Are we finally free then?

I doubt it. A historical dialectic can be observed in the history of decentralized architectures: due to their inherently structure-conservative path dependency, innovation shifts to higher layers, where new players can play out the strengths that centrality gives them.

Let's imagine the following scenario. The Ethereum network produces its first truly meaningful and widespread killer app, the VisiCalc11 of the blockchain age. Let's call it Woolit. Woolit is a dApp – a decentralized app – for buying, exchanging, managing and storing the 100 most important cryptocurrencies. It is not just a wallet; it is connected to its own coin exchange, which makes dealing with all kinds of cryptocurrencies super easy.

Now this dApp needs a website for marketing, for administering your account and for operating the coin exchange. The Woolit website is conventionally hosted on a web server. The interface no longer writes to a database but to the Ethereum blockchain, which makes no visible difference in the user experience. The company also publishes apps for iPhone and Android, which can likewise be downloaded from the website. The blockchains of the respective cryptocoins are also stored on the central server, for the sake of simplicity and efficiency.

However, the popularity of the app only really goes through the roof when it introduces a feature that processes transactions among its users in milliseconds and makes them completely free of charge. This works via an internal escrow system that executes the transactions in parallel on the server-side database and guarantees the value transfer until the actual blockchain transfer is completed. The escrow system can suddenly also be used to limit fraud, since transactions can be recalled automatically: the Woolit wallet simply instructs the fraudulent party to return the money. If such an intervention does not suit you, you can give up your Woolit and switch to another wallet. The Lightning Network12, which has been under construction for several years and is supposed to provide similar functionality via a decentralized architecture, is still not finished at this point and has nothing to set against Woolit's fast, technically pragmatic solution. Woolit is now as simple, convenient and secure as any other payment app on the market, and it makes handling all major cryptocurrencies a mass market for the first time.

Woolit is such a great success that it first drives most other wallet systems off the market, then gradually many coin exchanges. The Woolit exchange begins to pull away from its competitors, offering features and conditions that the others cannot keep up with. Woolit starts charging other exchanges fees when they want to transfer money to Woolit customer IDs. Retail stores now all have a Woolit logo on their checkout systems, indicating that customers can conveniently pay with the Woolit app. Soon Woolit makes its customers a special offer: if they transfer money and pay exclusively within Woolit, every twelfth month is free of fees. Most Woolit customers join in.

One day, Woolit receives an official request from the American State Department: it is asked to freeze and block all accounts of the Swedish carpet trading company Carpet.io, which is guilty of doing business in Iran contrary to the sanctions. Of course Woolit complies, since it is based in the US. Woolit can't delete or freeze accounts on any blockchain, but it can block access to them via its interface. Carpet.io can of course switch to another wallet – there are still a few open source projects on GitHub – but these are slow and usually don't support all the features or all the coins that Carpet.io holds. In addition, Carpet.io has lost access to the Woolit exchange and has to go through other exchanges with worse prices and features. Most importantly, it has lost access to most other coin owners, because most of them are Woolit customers who – in order to save on fees – exchange coins exclusively within Woolit. That's faster, safer and more convenient anyway. Carpet.io gives up and files for bankruptcy.

Today, Woolit has 50,000 employees, data centers worldwide, and is the third most valuable company in the world. It also commands the most mining capacity and could easily launch a 51% attack on the majority of the cryptocurrencies it hosts. But that would only crash the value of those currencies, and why would it hurt itself with such nonsense? Woolit's customers understand this too, and therefore trust the company. So do most governments, with which Woolit has had a trusting relationship ever since it helped the authorities dry up most of the organized crime: money laundering has become difficult since Woolit came to dominate the crypto market. Who needs decentralization anyway?

  1. Philip Elmer-DeWitt: First Nation in Cyberspace, http://kirste.userpage.fu-berlin.de/outerspace/internet-article.html
  2. Paul Baran: On Distributed Communications, https://www.rand.org/content/dam/rand/pubs/research_memoranda/2006/RM3420.pdf
  3. Wikipedia: https://en.wikipedia.org/wiki/National_Science_Foundation_Network
  4. Réka Albert, Hawoong Jeong, Albert-László Barabási: Error and attack tolerance of complex networks, https://www.researchgate.net/publication/1821778_Error_and_attack_tolerance_of_complex_networks
  5. Nicky Woolf: DDoS attack that disrupted internet was largest of its kind in history, experts say, https://www.theguardian.com/technology/2016/oct/26/ddos-attack-dyn-mirai-botnet
  6. Wikipedia: https://en.wikipedia.org/wiki/Metcalfe%27s_law
  7. This is obviously changing constantly: https://digiconomist.net/bitcoin-energy-consumption
  8. Osato Avan-Nomayo: 51 Percent Attack: Hackers Steals $18 Million in Bitcoin Gold (BTG) Tokens, https://bitcoinist.com/51-percent-attack-hackers-steals-18-million-bitcoin-gold-btg-tokens/
  9. Kyle Torpey: The Failure of SegWit2x Shows Bitcoin is Digital Gold, Not Just a Better PayPal, https://www.forbes.com/sites/ktorpey/2017/11/09/failure-segwit2x-shows-bitcoin-digital-gold-not-paypal/
  10. Michael Seemann: Blockchain for Dummies, http://www.ctrl-verlust.net/blockchain-for-dummies/
  11. VisiCalc by Dan Bricklin and Bob Frankston, http://history-computer.com/ModernComputer/Software/Visicalc.html
  12. Wikipedia: https://en.wikipedia.org/wiki/Lightning_Network