The Technium

The Machine That Made Us



Computer scientist Joseph Weizenbaum recently passed away at the age of 85. Weizenbaum invented the famous Eliza chat bot forty years ago. Amazingly, this pseudo-AI still has the power to both amuse and confuse us. But later in life Weizenbaum became a critic of artificial intelligence. He was primarily concerned about the pervasive conquest of our culture by the computational metaphor — the idea that everything interesting is computation — and worried that in trying to make thinking machines, we would become machines ourselves. Weizenbaum’s death has prompted a review of his ideas, set out in his book “Computer Power and Human Reason”.

On the Edge, Nick Carr says this book “remains one of the best books ever written about computing and its human implications. It’s dated in some of its details, but its messages seem as relevant, and as troubling, as ever. Weizenbaum argued, essentially, that computers impose a mechanistic point of view on their users — on us — and that that perspective can all too easily crowd out other, possibly more human, perspectives.” He highlights one passage worth inspecting.

The computer becomes an indispensable component of any structure once it is so thoroughly integrated with the structure, so enmeshed in various vital substructures, that it can no longer be factored out without fatally impairing the whole structure. That is virtually a tautology. The utility of this tautology is that it can reawaken us to the possibility that some human actions, e.g., the introduction of computers into some complex human activities, may constitute an irreversible commitment. . . . The computer was not a prerequisite to the survival of modern society in the post-war period and beyond; its enthusiastic, uncritical embrace by the most “progressive” elements of American government, business, and industry quickly made it a resource essential to society’s survival in the form that the computer itself had been instrumental in shaping.

That’s an elegant summary of a common worry: we are letting the Machine take over, and letting it take us over in the process.

Reading this worry, I was reminded of a new BBC program called “The Machine That Made Us.” This video series celebrates not the computer but the other machine that made us — the printing press. It’s a four-part investigation into the role that printing has played in our culture. And it suggested to me that everything Weizenbaum said about AI might be said about printing.

So I did a search-and-replace in Weizenbaum’s text. I replaced “computer” with this other, older technology, “printing.”
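For the curious, here is a minimal sketch of that substitution in Python. It is only illustrative: the passage is abridged, and the ordering and the final clean-up rule are my assumptions about the light hand-editing visible in the result below.

    # A simple ordered find-and-replace over Weizenbaum's passage
    # (abridged here). Assumed clean-up: the published result reads
    # "Printing becomes", not "The printing becomes".
    passage = (
        "The computer becomes an indispensable component of any structure "
        "once it is so thoroughly integrated with the structure, so enmeshed "
        "in various vital substructures, that it can no longer be factored "
        "out without fatally impairing the whole structure."
    )

    # Replace the plural first so "computers" never becomes "printings",
    # and handle the sentence opening before the bare noun.
    for old, new in [
        ("computers", "printing"),
        ("The computer becomes", "Printing becomes"),
        ("computer", "printing"),
    ]:
        passage = passage.replace(old, new)

    print(passage)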

Printing becomes an indispensable component of any structure once it is so thoroughly integrated with the structure, so enmeshed in various vital substructures, that it can no longer be factored out without fatally impairing the whole structure. That is virtually a tautology. The utility of this tautology is that it can reawaken us to the possibility that some human actions, e.g., the introduction of printing into some complex human activities, may constitute an irreversible commitment. . . . Printing was not a prerequisite to the survival of modern society; its enthusiastic, uncritical embrace by the most “progressive” elements of government, business, and industry quickly made it a resource essential to society’s survival in the form that the printing itself had been instrumental in shaping.


Stated this way, printing sounds pretty vital and foundational, and it is. I could have done the same replacement with the technologies of “writing” or “the alphabet” — both equally transformative and essential to our society.

Printing, writing, and the alphabet did in fact bend the culture to favor themselves. They also made themselves so indispensable that we cannot imagine culture and society without them. Who would deny that our culture would be unrecognizable without writing? And, as Weizenbaum indicated, the new embedded technology tends to displace the former mindset. Orality is gone, and our bookish culture is often at odds with oral cultures.

Weizenbaum’s chief worry seems to be that we would become dependent on this new technology, and that because it has its own agenda and self-reinforcement, it would change us away from ourselves (whatever that may be).

All of this is true. But as this exercise makes clear, we’ve gone through these kinds of self-augmenting transitions several times before, and I believe we have come out better for it. Literacy and printing have improved us, even though we left something behind.

Weizenbaum (and probably Carr) would have been one of those smart, well-meaning elder figures in ancient times preaching against the coming horrors of printing and books. They would highlight the loss of orality, and the way these new-fangled auxiliary technologies demean humanity. We have our own memories, people: use them! They would have been in good company, since even Plato lamented the same.

There may indeed be reasons to worry about AI, but the facts that AI and computers tend to be pervasive, indispensable, foundational, self-reinforcing, and irreversible are not reasons alone to worry. Rather, if the past history of printing and writing is any indication, they are reasons to celebrate. With the advent of ubiquitous computation we are about to undergo another overhaul of our identity.




Comments
  • Steve Witham

    I think Weizenbaum was right to warn against technical fixes and oversimplified models. But I think he made computers into a scapegoat, and based moral distinctions on flimsy grounds.

    I like the computer/print analogy. Here’s my take on that, where I talk about Google and “The Humanities” and being “in the driver’s seat.”

    Kevin, you write, “…The facts that AI and computers tend to be pervasive, indispensable, foundational, self-reinforcing, and irreversible are not reasons alone to worry. Rather, if the past history of printing and writing is any indication, they are reasons to celebrate. …We are about to undergo another overhaul of our identity.”

    I don’t think those are reasons alone to celebrate either. Sure, change is better than absolute stasis, but those are reminders that change is good to the extent we’re conscious of what kinds and qualities of overhauls we’re going through, reminders of how casual choices can become commitments and blinders.

  • glory

    a common worry: we are letting the Machine take over

    yea, from the luddites on to dune (i.e. mentats), terminator (vs. skynet), cf. BSG (cylon virus attacks on human computer networks), but i think chris crawford — http://www.metafilter.com/67750/Parmenides#1958210 — gets it right extending the cognitive shift that occurs from text to hypertext: “where sequential thinking imagines a line of nodes, subjunctive thinking sees each node as a branchpoint from which a thousand possibilities emerge. The workload of keeping track of all those possibilities is too much for the human brain to handle, but now we have a medium that is ideally suited for subjunctive thinking: the computer. Thus, the computer will permit the full exploitation of subjunctive thinking in the same way that writing permitted the full exploitation of sequential thinking. We are about to enter a new period in the human story every bit as brilliant as that of classical Greece…”

    iow, like you say, computers augment human intelligence,* which of course changes us in turn, but that’s far from letting them “take over” :P

    cheers!


    *http://denbeste.nu/cd_log_entries/2003/12/Superhumanintelligence.shtml & http://www.heise.de/tp/r4/special/glob.html

  • Joe Knepper

    Very interesting. I was going down a similar thought process this evening as I was writing a paper for my Master’s BI program. I was trying to describe the evolutionary trend we see with web sites taking on an “intelligent,” human-like quality. I came up with the term “online anthropomorphism” and went online to see what others have to say about this. A few clicks and links later, I arrived at your Technium site. I’m sorry to say I haven’t kept up with you over the years, which is a shame since you are one of my heroes. I guess I can’t be accused of hero worship. Years ago I read Out of Control and it changed my life (for the better, I think). I wish I could find a poster of the nine laws of god. If I had a religion these would be my 10 commandments. Now that I know where you are in cyberspace, I won’t let you get away. Thanks for being you, Kevin.

    -Joe

  • http://theworstofperth.com The Worst of Perth

    Excellent comparison. Celebrating the next upheaval is the way to go. BTW, linking to that Stephen Fry Gutenberg video makes people outside the UK scream with frustration, as the BBC, despite pretensions of being a global brand, moronically only allows it to play inside the UK. Why everyone else gets that the internet is global except for them is incredible. Their claims of copyright problems just make them look like bigger plonkers.
    The Worst of Perth

  • Rick York

    Mr. Kelly,

    It seems to me that we may be on the way to a full circle of sorts. If the cultural embedding of printing and computers is now behind us, can we view 3D printing as the next technology which will embed itself into our culture?

    While this technology currently is used mostly for prototyping, its potential seems as powerful as either of its predecessors. I have read recently (Unfortunately I cannot remember where. It might have been Bruce Sterling’s blog.) that someone is working on a 3D printer which will, among other things, reproduce itself. The potential impact of this technology on industry, economics and culture is hard to exaggerate.

    One can easily imagine the resistance to an open source, self-reproducing technology which individual humans, with the right templates, can use to enhance their own physical well-being.

    With access to some basic raw materials, what T. Friedman called super-empowerment really could become ubiquitous.

    Hmm….

  • http://heybryan.org/ kanzure

    As with many posts on AI, chances are that readers will follow off to think about symbols, the grounding problem, and other interesting subjects. However, I offer another interpretation, focused on fabrication and the process of creating (aggregation?). So, when you s/computing/printing/ in the Weizenbaum excerpt, a few tidbits are lost. Printing is fabrication, while computing is much more virtual. So our eye should be kept on self-augmentation (as you mention), as opposed to distancing it from the people and societies intricately linked to the tech, the programming, and the social knowledge required to hack out lines of code otherwise unavailable to intuition. I am reminded of an Infinity Plus interview: “.. and more, I would say that the so-called virtual realities are misnamed: they should be called something like ‘simulated experiences.’ Because they aren’t real, and can never be so, any more than a map can be the territory. And more, for the same reason that a map is necessarily less detailed than the territory that it describes, a virtual reality can only ever be a pale shadow of the real thing. Such constructs might prove amusing, or even useful and illuminating, but how could they ever take the place of the essential reality that they represent?”

    - Bryan

  • Tom Buckner

    I read a wonderful book some ten years ago titled Self-Made Man: Human Evolution From Eden to Extinction, by Jonathan Kingdon. To oversimplify, his thesis was that we are not fated to alter ourselves with our technology: we already did. Our ‘primitive’ ancestors, for instance, all had massive teeth, mandibles, and jaw muscles. Cooked food bred that out of us. (As an aside, there’s much more that fascinates in this book: see the section on the invention of boats. One of the earliest floating devices was to lie prone on a log and paddle across a croc-infested river; the crocodiles left the swimmer alone because they thought he was one of them! Cleverness isn’t new.)

    As for the pervasiveness of the computational metaphor, I think this is a meme that hasn’t really made it to Main Street. More commonly, the non-geek majority view the computer through a vitalist metaphor. This is why poor simple Eliza was passing the Turing test decades ago with so many interlocutors.

    As the previous Zillionics post noted, Things Change when the numbers get big enough. When I first interacted with Commodore 64 computers, these machines were simple enough to do nothing very surprising. They still seemed like machines. By the time I had a 486 with Windows and a modem, the machine was complicated enough to seem to ‘have a mind of its own.’ It would do things I couldn’t understand. I still knew it was a machine, but it was beyond my ken.

    We don’t know where we are heading. But the Fermi Paradox is starting to become relevant to our own lives. If other civilizations ever existed in the galaxy, Where Is Everybody? Do technological civilizations always wreck themselves before they can spread off planet? Do thinking machines eliminate their makers, or save them? Do civilizations turn inward to a virtual world that has made the physical one seem dull? Or does the gas just run out and leave the people digging for turnips?

  • http://drewkime.blogspot.com Drew Kime

    The difference, I think, is that writing is something that can be entirely learned by each person who uses it. There is no risk of being unable to use the technology.

    I’ll grant that large-scale printing presses are far beyond the reach of individuals, and even whole cultures. But the core concept of printing is so simple that it could be easily duplicated, and scaled up, by anyone with basic mechanical aptitude.

    Computation is a whole different animal. The utility of “pervasive computing” is explicitly the pervasiveness. The smart chip in my credit card is useless without a reader. Or a computerized register to attach the reader to. Or an internet connection between the register and the VISA system. Or … or … or …

    Look at any major urban blackout. Commerce comes to a complete standstill within minutes. You can literally turn the technology off, and society grinds to a halt.

    If you smash a printing press, everyone who already had a book still has it. If you burn all the books, people still know how to write.

    But disable the networks that allow us to engage in commerce, and suddenly the grocery store shelves go empty. And we don’t have local farms any more to fall back on.

    I’m not suggesting that we’re *likely* to face a protracted breakdown in the current system. But the *impact* of such a breakdown is certainly on a new scale.

  • http://cube3.com larryr

    The past’s alphabet, or the tools that accelerated it, could NEVER grow beyond our mediation and control. That just isn’t true anymore of our latest tools/media.

    The network, the binary, and the machine are not in any way “tied” to us... and are being built to replace us. Except through their own AI-driven delusions, if they choose to have them, they may not need or relate to us…

    Crap. Just watch the aimless meta-ramblings of the Cylons on every episode of the “improved” Galactica to see the psychosis of the machines to come. ;)

    Our identity is still pretty much the same as it’s been for 50k years. We eat, crap, and make more of us. Self-awareness allows for a freedom of choice within Nature, and we act within its reality. And we pass on our failings and heroics to our children… we get what we are.

    All mediating technologies have done so far, in application, is “adjust” our relation to nature. The A-bomb, and now the AI non-human being so craved to soon exist, are the exceptions only because they now represent the end of “adjustment” and the “reality of a final extinction.”

    I’m not so sure the dinosaurs rejoiced in the mediation of the asteroid hit.

    Except those, I guess, who believed in a rapture.


  • Don Miller

    “Weizenbaum became a critic of artificial intelligence”

    Where is A.I.? We don’t have thinking machines, and I don’t believe we will any time soon. We don’t even have simple semi-A.I. processors such as optimal traffic control devices.

    I worked in software and investigated A.I. solutions for two decades. True machine intelligence is always a few years away.

    I saw a TED presentation that claimed the human brain will be duplicated by 2020. Ha. I find all these claims of “artificial intelligence” outlandish. It’s a field designed to achieve tenure or venture capital.

  • jack

    Kevin,

    I am a student at the University of East Anglia in the UK. I recently wrote a paper about the advent of a “print culture” in 18th-century England and how it spurred the development of the English novel. If this sounds interesting I’d be happy to e-mail you the paper and the references (there’s a book by McLuhan called “The Gutenberg Galaxy: The Making of Typographic Man” that’s particularly good).