The Technium

The Computational Metaphor

The least-noticed trends are usually the most subversive ones. First on my list for an undetected upheaval is our collective journey toward the belief that the universe is a computer.

Already the following views are widespread: thinking is a type of computation, DNA is software, evolution is an algorithmic process. If we keep going we will quietly arrive at the notion that all materials and all processes are actually forms of computation. Our final destination is a view that the atoms of the universe are fundamentally intangible bits. As the legendary physicist John Wheeler sums up the idea: “Its are bits.”

I first became aware of this increasingly common (but not yet articulated) trend at the first Artificial Life Conference in 1987, where biological reproduction and evolution were described by researchers in wholly computer-science terms. The surprise wasn’t that such organic things could be given mathematical notations, because scientists have been doing that for hundreds of years. The surprise was that biological things could be simulated by computers so well. Well enough that such simulations displayed unexpected biological properties themselves. From this work sprang such fashionable patterns as cellular automata, fractals, and genetic algorithms.
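To make the cellular-automaton idea concrete, here is a minimal sketch of an elementary one-dimensional automaton (Rule 110 is a standard illustrative choice; the grid width and step count here are arbitrary, not from the essay). A single fixed rule, applied to every cell at every tick, yields the kind of unexpectedly lifelike growth patterns the Artificial Life researchers were studying.

```python
# A one-dimensional cellular automaton. Each cell is 0 or 1, and the
# next state of a cell depends only on itself and its two neighbors.
# The 8 possible neighborhoods index the bits of the rule number.

def step(cells, rule=110):
    """Apply an elementary cellular-automaton rule to one row of cells."""
    n = len(cells)
    nxt = []
    for i in range(n):
        left = cells[(i - 1) % n]          # wrap around at the edges
        center = cells[i]
        right = cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right  # value 0..7
        nxt.append((rule >> neighborhood) & 1)
    return nxt

def run(width=31, steps=10, rule=110):
    """Start from a single live cell and return the full history of rows."""
    row = [0] * width
    row[width // 2] = 1
    history = [row]
    for _ in range(steps):
        row = step(row, rule)
        history.append(row)
    return history

if __name__ == "__main__":
    for row in run():
        print("".join("#" if c else "." for c in row))
```

Nothing in the update rule mentions growth or structure, yet structure appears anyway; that gap between the simplicity of the rule and the richness of the behavior is the point.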

The next step in this trend was to jettison the computer matrix and reimagine biological processes simply in terms of computer logic. But to do this, first computation had to be stripped from computers as well. Starting with the pioneering work of von Neumann and Turing, a number of mathematicians concluded that the essential process of computing was so elementary and powerful that it could be understood to happen in all kinds of systems. Or, in other words, the notion of computation was broadened so widely that almost any process or thing could be described in computational terms. Including galaxies, molecules, mathematics, emotions, rain forests, and genes.

Is this embrace just a trick of language? Yes, but that is the unseen revolution. We are compiling a vocabulary and a syntax that is able to describe in a single language all kinds of phenomena that have escaped a common language until now. It is a new universal metaphor. It has more juice in it than previous metaphors: Freud’s dream state, Darwin’s variety, Marx’s progress, or the Age of Aquarius. And it has more power than anything else in science at the moment. In fact the computational metaphor may eclipse mathematics as a form of universal notation.

This quickening of the new metaphor was made crystal clear recently in the work of mathematicians and physicists who have been dreaming up the next great thing after silicon chips: quantum computers. Quantum computers lie at the convergence of two “impossible” fields, the world of the impossibly small (quantum states), and the world of the impossibly ghostly (bits). Things get strange here very fast, but one thing is strangest of all. In the effort to create mathematical theories of how matter works at levels way below subatomic particles, and in the effort to actually build computers that operate in this realm, some scientists have found that using the language of bits best explains the behavior of matter. Their conclusion: Its are bits. Young Einsteins such as mathematician/theoretical physicist David Deutsch are now in the very beginnings of a long process of redescribing all of physics in terms of computer theory. Should they succeed, we would see the material universe and all that it holds as a form of computation.

There will be many people who will resist this idea fiercely, for many good reasons. They will point out that the universe isn’t really a computer, only that it may act as if it is one. But once the metaphor of computation infiltrates physics and biology deeply, there is no difference between those two statements. It’s the metaphor that wins.
And as far as I can tell the computational metaphor is already halfway to winning.

(This was first published in the Whole Earth Review, Winter, 1998)

  • Andreas Sefzig

    Oops, late to the discussion… Anyway, I have stuck with this metaphor for quite a while now and, talking with friends about the matter, I often meet the argument “One tends to think that way because it is the most farsighted picture: yet only right now”.

    In a sense, this is true, since we don’t have access to all future inventions. But the crux is that this metaphor will evolve hand in hand with us creating computers as children of our very selves. So this metaphor will rightfully be used for as long as the dog runs after its tail.

  • Gray

    The acceptance problem that the theory has is mainly, I think, down to that word ‘computer’ and its suffusion with all sorts of design- and intentionality-laden connotations, simply because our computers are designed and built to carry out equally designed sets of algorithms.

    I see the confusion – like most confusions of paradigm – as one of language and definition, just like the ‘God’ confusion. As Dennett says at the end of his most recent book, when two people both worship Rock, but neither of them realises that they worship two different ones (Hudson and ‘n’ Roll), confusion is imminent.

    It might make more sense to say that the universe is algorithmic, because we’re all aware that algorithms appear everywhere without them having to have been designed. Newtonian mechanics is wonderfully and simply algorithmic. And so are the chaotic curlicues of non-linear dynamic systems (although they’re much harder to reverse-engineer to see what the algorithms are).

    The problem is that the name has already stuck, of course, so it’s beyond changing now. The WTF reaction to the idea cements it quite thoroughly.

    Knowledge databases are already undergoing transformations from hierarchical structures to flat, linked, tagged structures, and although routes through these can be ‘computed’ far less efficiently, they do produce more ‘meaningful’ – to us – representations of information.

    As we try to understand more and more of the way that information behaves in the universe – this massively highly fed-back network – we’ll almost certainly have to reorganise our first attempts at how computers should behave. I think we’ll have to ‘go analogue’. It’s no surprise that our brains can ‘understand’ so much of our environment (at least from a functional point of view), having evolved to do just that, but that computers cannot even come close to the kind of intelligence that we can manage in dealing with the world.

    Maybe the digital revolution is about to run its course, like the sailing ships of old, and the optical quantum computer will be the thing that makes ‘sense’ of everything. I just know this is what Deutsch is banking on. The problem will then shift to having a clue what ‘understanding’ is anymore, because any network sufficiently complex to make sense of the world will be too complex to make sense of itself, or for us to make sense of. We’ll be trading in reductionist understanding for… what? Created peers, I suppose.
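    Gray’s remark above that Newtonian mechanics is “wonderfully and simply algorithmic” can be illustrated with a toy sketch: one simple update rule, applied step after step, reproduces a falling body. The function name, gravity constant, and step size below are illustrative choices, not anything from the comment itself.

    ```python
    # Newton's laws as an algorithm: the same tiny rule applied each tick.
    # This is semi-implicit Euler integration of constant-gravity free fall.

    def simulate_fall(height=100.0, g=9.81, dt=0.01):
        """Drop a body from `height` metres; return the time to hit the ground."""
        y, v, t = height, 0.0, 0.0
        while y > 0.0:
            v += g * dt   # rule 1: gravity updates velocity
            y -= v * dt   # rule 2: velocity updates position
            t += dt
        return t
    ```

    No one designed the algorithm into the physics; the physics simply behaves as if it were running one, which is exactly the point of preferring the word ‘algorithmic’ to ‘computer’.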

  • Kevin Kelly

    “I think we’ll have to ‘go analogue’.”

    I call that the revenge of the analog, and no doubt it’s something we can expect to surprise us in the future.