The next stage in human technological evolution is a single thinking/web/computer that is planetary in dimensions. This planetary computer will be the largest, most complex, and most dependable machine we have ever built. It will also be the platform that most business and culture will run on. The web is the initial OS of this new global machine, and all the many gadgets we possess are windows into its core. Future gizmos will be future gateways into the same One Machine. Designing products and services for this new machine requires a unique mind-set.
What are the dimensions of this global Machine?
Today it contains approximately 1.2 billion personal computers, 2.7 billion cell phones, 1.3 billion land phones, 27 million data servers, and 80 million wireless PDAs. The processor chips of all these parts are feeding the computation of the internet/web/telecommunications system. So how many transistors are powering the Machine?
An Intel Pentium processor circa 2004 has 100 million transistors in it, while an Itanium processor inside a server has had over 1 billion transistors since 2005. More current models have more transistors, of course, but these older models are closer to the average count.
One thing to note is that there are about as many processing chips in the Machine (one billion, from the one billion online PCs) as there are transistors in an Itanium chip. The Machine is a supercomputer in which each "transistor" is itself a computer. A very rough upper bound on the computing capacity of this Machine, then, is a billion times a billion, or one quintillion (10^18) transistors. Since only the newest servers carry a billion transistors per chip, the real figure is probably an order of magnitude smaller. When we add the transistors in cell phones and handhelds, it works out to about 170 quadrillion (10^17) transistors wired into the Machine.
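A back-of-envelope tally makes the 10^17 figure concrete. The device counts and the Pentium/Itanium transistor counts come from the text above; the per-chip figures for cell phones and PDAs are my assumptions, since the text does not give them:

```python
# Rough tally of transistors wired into the Machine, circa 2007.
DEVICES = {
    # name: (units online, transistors per processor)
    "personal computers": (1.2e9, 1e8),  # ~Pentium-class, 100M transistors
    "cell phones":        (2.7e9, 1e7),  # assumed modest embedded chips
    "data servers":       (27e6,  1e9),  # ~Itanium-class, 1B transistors
    "wireless PDAs":      (80e6,  5e7),  # assumed mid-range chips
}

total = sum(units * per_chip for units, per_chip in DEVICES.values())
print(f"~{total:.1e} transistors")  # ~1.8e+17, on the order of 10^17
```

The PCs dominate the sum; even generous guesses for the phone chips barely move the total off 10^17.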
There are about 100 billion neurons in the human brain. Today the Machine has roughly six orders of magnitude (a million times) more transistors than you have neurons in your head. And the Machine, unlike your brain, is doubling in power every couple of years at the minimum.
In 2003 alone a total of one quintillion transistors were produced, but not all of them are wired into the Machine. Many transistors made their way into cameras, TVs, GPS units, and the like, few of which are currently online. One day they will be. Every chip will eventually connect to the web in some fashion. That would mean we would be adding as many transistors to the Machine each year as exist in it right now.
If the Machine has 100 quadrillion transistors, how fast is it running? If we include spam, there are 196 billion emails sent every day. That's 2.2 million per second, or 2 megahertz. Every year 1 trillion text messages are sent, which works out to 31,000 per second, or 31 kilohertz. Each day 14 billion instant messages are sent, at 162 kilohertz. Searches run at 14 kilohertz. Links are clicked at the rate of 520,000 per second, or 0.5 megahertz.
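The rate conversions above are simple division by seconds; a quick sketch reproduces them, using the daily and yearly event counts quoted in the text:

```python
SECONDS_PER_DAY = 24 * 60 * 60         # 86,400
SECONDS_PER_YEAR = 365 * SECONDS_PER_DAY

emails_hz = 196e9 / SECONDS_PER_DAY    # ~2.3 million/second (~2 MHz)
sms_hz = 1e12 / SECONDS_PER_YEAR       # ~32,000/second (~31 kHz)
im_hz = 14e9 / SECONDS_PER_DAY         # ~162,000/second (162 kHz)

for name, hz in [("email", emails_hz), ("SMS", sms_hz), ("IM", im_hz)]:
    print(f"{name}: {hz:,.0f} events/second")
```

Treating each message as one "clock tick" is of course a loose metaphor; the point is only that the Machine's visible event rates sit in the kilohertz-to-megahertz range.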
There are 20 billion visible, searchable web pages and another 900 billion dark, unsearchable, or deep web pages (for instance, pages behind passwords, or the kind of dynamic page Amazon produces when you query it). The average searchable web page contains 62 links. Assuming the same count for dynamic pages, that means there are roughly 57 trillion links in the full web. We could think of each link as a synapse: a potential connection waiting to be made. There are between 100 billion and 100 trillion synapses in the human brain, which puts the Machine in the same neighborhood as our brains.
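The link arithmetic is easy to check; multiplying the page counts above by 62 links per page gives:

```python
visible_pages = 20e9    # searchable web pages
deep_pages = 900e9      # dark / deep / dynamic web pages
links_per_page = 62     # measured for searchable pages, assumed for the rest

total_links = (visible_pages + deep_pages) * links_per_page
print(f"{total_links / 1e12:.0f} trillion links")  # 57 trillion
```

That lands near the top of the 100-billion-to-100-trillion synapse range, which is what puts the Machine "in the same neighborhood" as a brain.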
There were more than 5 exabytes (5 × 10^18 bytes) of information stored in the world in 2003, but most of it was kept offline on paper, film, CDs, and DVDs. Since then online storage has mushroomed. Today the Machine's memory totals some 246 exabytes of information (246 billion gigabytes!). This storage is expected to grow to 600 exabytes by 2010.
But not all the information flowing through the Machine is stored. An increasing amount is generated and pushed through the net with nothing more than temporary copies made along the way. One study estimates that the information stream in 2007 is 255 exabytes while storage is only 246 exabytes, and that this gap between generation and storage will widen by 20% per year. We might think of this total amount of information as "movage," or even the 9-exabyte difference as the Machine's RAM. The total movage estimated for the Machine in 2010 is one zettabyte (10^21 bytes).
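The jump from 255 exabytes of movage in 2007 to a zettabyte in 2010 implies a steep compound growth rate, which we can back out from the two endpoint figures quoted above:

```python
movage_2007 = 255     # exabytes per year, 2007 estimate
movage_2010 = 1000    # one zettabyte = 1,000 exabytes
years = 3

growth = (movage_2010 / movage_2007) ** (1 / years) - 1
print(f"implied annual growth in movage: {growth:.0%}")  # roughly 58%/year
```

In other words, the flow of information is projected to grow far faster than Moore's-law doubling over that stretch.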
To keep things going the Machine uses approximately 800 billion kilowatt hours per year, or 5% of global electricity.
One of the problems we have in discussing this Machine is that its dimensions so far exceed the ordinary units we are accustomed to that we don't have a way to reckon its scale. For instance, the total international bandwidth of the global machine is approximately 7 terabytes per second. We used to talk about one Library-of-Congress-worth of information (10 terabytes), but that volume seems absolutely puny now. In ten years, terabytes will fit on your iPod. Keeping that metric for the moment, nearly one Library-of-Congress-worth of information is zipped around on the Machine every second. These are very deep cycles of processing. What will we use to measure traffic in another 15 years?
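Under the two figures above (7 terabytes per second of bandwidth, 10 terabytes per Library of Congress), the per-second traffic works out as:

```python
bandwidth_tb_per_s = 7    # international bandwidth, terabytes/second
loc_tb = 10               # one Library-of-Congress-worth, in terabytes

loc_per_second = bandwidth_tb_per_s / loc_tb
print(f"{loc_per_second:.1f} Libraries of Congress per second")  # 0.7
```

So "one Library of Congress per second" is a slight round-up; the exact figure is about seven-tenths of one, every second.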
We could start by saying the Machine currently has 1 HB (Human Brain) equivalent of processing power. That measure might hold up for a decade or so, but once it gets to 100 HB, or 10,000 HB, it begins to feel like using inches to measure galactic space.
While personal computers are increasing in power roughly at Moore's rate, doubling every couple of years, the Machine can advance in power even faster because its total power is some exponential multiple of all the computers comprising it. Not only are its "transistors" doubling in power, the number of them is doubling, and the connections between them are increasing exponentially as well. Computer chip manufacturers talk about making chips in 3D, rather than the conventional flat 2D of today, in order to gain another dimension in which to expand the number of transistors. The Machine offers more than this. It can expand in all of its many dimensions at once, so its power may rise at a rate that exceeds the rates of any of its components.
Somewhere between 2020 and 2040 the Machine should exceed 6 billion HB. That is, it will exceed the processing power of humanity.
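It is worth noting what doubling rate that date range implies. Going from 1 HB today to 6 billion HB takes about 32.5 doublings; at Moore's two-year pace that would not happen until around 2072, so the 2020–2040 window assumes the Machine doubles roughly every six to twelve months. A quick check, starting from the 1 HB estimate above:

```python
import math

target_hb = 6e9                   # the processing power of humanity, in HB
doublings = math.log2(target_hb)  # ~32.5 doublings needed from 1 HB

for years_per_doubling in (0.5, 1, 2):
    year = 2007 + doublings * years_per_doubling
    print(f"doubling every {years_per_doubling} yr: "
          f"exceeds 6 billion HB around {year:.0f}")
```

The prediction, in other words, rests on the argument of the previous paragraph: that the Machine as a whole outpaces the Moore's-law growth of its parts.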