The Machine in the Ghost




    This is Chapter 2 of The Future Does Not Compute: Transcending the Machines in Our Midst, by Stephen L. Talbott. Copyright 1995 O'Reilly & Associates. All rights reserved. You may freely redistribute this chapter in its entirety for noncommercial purposes. For information about the author's online newsletter, NETFUTURE: Technology and Human Responsibility, see http://www.netfuture.org/.

    The intelligence of computers is delivered upon tiny chips made of silicon -- just about the most homely and earthy material known to man. Silicon amounts pretty much to sand. Apply a few volts of electricity to some duly prepared slivers of silicon, and -- if you are like most people -- there will suddenly take shape before your eyes a Djinn conjuring visions of a surreal future. It is a future with robots who surpass their masters in dexterity and wit; intelligent agents who roam the Net on our behalf, seeking the informational elixir that will make us whole; new communities inhabiting the clean, infinite reaches of cyberspace, freed from war and conflict; and lending libraries of "virtually real" experiences that seem more sensational than the real thing -- all awaiting only the proper wave of industry's well-proven technological wand.

    As you probably realize, not all of this is idle or fantastic speculation -- even if it is the rather standard gush about our computerized future. Something like this is indeed coming -- in fact, has already arrived. And few observers can see any clear limits to what computers might eventually accomplish. It is this stunning, wide-open potential that leads some people to wonder what the Djinn will ask of us in return for the gift. After all, any potential so dramatic, so diverse, so universal, can be taken in many directions. That is its very nature. Who will choose the direction -- we, or the Djinn?

    Tools get under our skin

    As far back as human traces go, man has used tools. But tools are slippery things -- exceedingly hard to define. Everything from a hand-held stone to language has been seen as a tool. Tools are, by most definitions, extensions of ourselves, and are both the result of and the means for our acting in the world. Even our own limbs may be used as tools. In fact, we can readily view our limbs as "archetypes," or primary examples, of what it means to be a tool.

    But haven't I lost my way if I can't tell the difference between a tool and myself, or even between a tool and my own, word-borne thoughts?

    Well, maybe not. At least, there's a truth here worth going after: When we talk about tools, we are, one way or another, talking about ourselves. There's a depth of meaning in the old saying, "To someone who has only a hammer, everything begins to look like a nail." We not only shape things with our tools; we are shaped by them -- our behavior adapts. This has been recognized in many different arenas. You may, for example, have heard the expression, "the medium is the message" -- that is, the tools we use to communicate a message affect what we say. One consequence is that you will probably find yourself putting on a different "personality" when you compose an electronic mail message than when you write a note on stationery. Somehow -- rather uncannily -- tools always seem to get "under our skin."

    The unconscious as steam engine

    One other thing is undeniable about tools: over the course of history they have become increasingly complex. This is a fascinating study in itself, for there seem to be certain thresholds of complexity -- or, perhaps, thresholds in our own minds -- beyond which the character of the tool is mightily transformed. There were, of course, various mechanical devices far back in history -- for example, looms and hoes and catapults. But during the Scientific and Industrial Revolutions, the cleverness embodied in mechanisms changed with extreme rapidity, entailing a kind of systematic, rationalized intricacy not seen before. A modern offset printing press or harvesting combine is as far removed from the loom of an ancient Greek household as -- well, as we feel ourselves to be from ancient Greece.

    Since a radical transformation of tools implies a parallel transformation of the tool user, we are not surprised to learn that the age of great mechanical invention was also the age during which our ancestors of a few hundred years ago began to "feel" as if they inhabited a clockwork universe. Here's how Owen Barfield describes the matter:

    I recall very well, when I was writing my early book, History in English Words, /1/ being astonished at the ubiquitous appearance of the clock as a metaphor shortly after it had been invented. It turned up everywhere where anybody was trying to describe the way things work in nature .... Coming a little nearer to our own time [the student of words] finds the psychology of the unconscious, in which the first half of the twentieth century felt so much at home. Strange how squarely it seems to be based on an image of "repression," which is much the same as compression! Was it after all just the steam-engine in disguise? /2/

    Barfield was writing before the computer age. If he were penning those words today, he would have to cite, not just another "contraption," but something strangely transcending all the products of the machine era. For now we seem to have crossed another threshold -- one carrying us far beyond the most impressive achievements of the Industrial Age. Computers offer us an entirely new order of complexity, intelligence, flexibility. They achieve what we could scarcely imagine an old-style mechanism achieving. These modest, unimposing devices on our desks demonstrate a remarkable capacity to emulate -- that is, to become any tool. Need a calculator? Typesetter? Mailbox? Pencil and paper? File cabinet? Library? Tape recorder? There they are, sitting in front of you, awaiting your command. A computer can even become, in its own way, a tornado or ocean wave, modeling these things in such a compelling manner that some theorists now believe the essence of the tornado or wave really is in the computer.

    But this new threshold is even more momentous than I have so far suggested. The truth that we cannot talk about tools without talking about ourselves, now becomes stunningly literal. Not only do our tools reveal things about us, they promise to become us! That, at least, is what many people think is happening through research in artificial intelligence. Other people worry that, because tools inevitably work their way under our skin, we are in the process of becoming "mere computers." Does our enthusiasm for computerlike models of the mind reflect our firm grasp of the computer, or rather the computer's firm grasp of us?

    We meet ourselves in our computers

    How do we begin assessing the computer as a human tool? The claims and counter-claims easily become tiresome. For every benefit of the computer you cite, I can point to a corresponding threat; and for every alarm I sound, you can herald a new opportunity. This slipperiness, in fact -- as I have already suggested -- must be our starting point. Part of the essence of a computer is its flexibility, its emulative ability, its diverse potentials. It is a universal machine. Given a technology of almost pure, open-ended potential, the machinery itself is, from a certain point of view, scarcely worth discussing. It is a template, a blank screen. Everything hinges upon what we bring to the technology, and which of its potentials we choose to realize. The one sure thing about the computer's future is that we will behold our own reflections in it.

    Even the "computer-human interface" people -- who have contributed so much to our understanding of machine design -- have failed to probe adequately the implications of the fact that we're really dealing with a human-human interface. It was software engineers who designed that obstructive program you struggled with last week. Computer scientists conceived the languages that constrained the programmers. And certain academicians first recognized, or thought they recognized, the quintessence of their own mentality in a transistorized logic gate. Could they have done so if they had not already begun to experience themselves as logic machines? Could I, for that matter, allow the hammer in my hands to run riot if there were no answering spirit of aggression within me?

    This is why I find naive rhapsodizing about computers so disconcerting. It expresses the very sort of blithe unawareness that converts the technology into a profound threat. Machines become a threat when they embody our limitations without our being fully aware of those limitations. All reason shouts at us to approach every aspect of the computer with the greatest caution and reserve. But what incentive has our culture provided for the exercise of such caution and reserve? It's more in our nature to let technology lead where it will, and to celebrate the leading as progress.

    Of course, every invention, from television to nuclear power, tends to incarnate the will (conscious or unconscious) of its employer. And if that will is less than fully conscious, the invention wields us more than we wield it. Can anyone really doubt that we have become the tools of television far more than the reverse? But the computer ups the ante in a game already extremely perilous. It relentlessly, single-mindedly, apes us even in -- or perhaps especially in -- those habits we are not yet aware of, for it is endowed in some sense with a mind of its own.

    All of which is to say that we have been progressively distilling into the computer certain pronounced tendencies of our own minds. These tendencies are certainly related to that entire several-hundred-year history by which our culture has gained both its technological triumphs and its horrors.

    But is the computer really just a blank screen reflecting our own natures? Doesn't it invite -- even encourage -- a one-sided display of human traits? It seems undeniable that what the computer asks from us is above all else "what computes." It asks us to abstract from human experience a quantity, a logic, that it can cope with. And yet, we must acknowledge that during the past several centuries we have shown, quite independently of the computer, a strong passion for reducing all of human experience, all knowledge, to abstraction. The computer is a perfected result of this urge. Can we blame the computer for this?

    The will toward artifice

    On the one hand: the machine as an expression of the human being. On the other hand: the machine as an independent force that acts or reacts upon us. Which is it? I am convinced there is no hope for understanding the role of technology in today's world without our first learning to hold both sides of the truth in our minds, flexibly and simultaneously. The relationship between human being and machine has become something like a complex symbiosis. We who devise "thinking machines" cannot escape our own most intimate responsibility for the planet's rapidly crystallizing, electromechanical nimbus, nor can we escape the prospect of its increasing -- and potentially threatening -- independence of mind and self-will.

    In sum: if machines do not simply control society, neither can we claim straightforward control of their effects. We and our mechanical offspring are bound together in an increasingly tight weave. To substantially modify the larger pattern -- rather than simply be carried along by it -- requires profound analysis of things not immediately evident, and a difficult effort to change things not easily changed. If it is only through self-awareness and inner adjustment that I can restrict the hammer in my hands to its proper role, I must multiply the effort a millionfold when dealing with a vastly more complex technology -- one expressing in a much more insistent manner its own urgencies.

    But that is not quite all. We are not -- yet, at least -- bound to our machines by a perfectly rigid symmetry of mutual influence. The willfulness we encounter in technology, even where it has long since detached itself from us, nevertheless originates in the human being -- a fact some of the severer critics of technology overlook. /3/ So long as we can document the nonneutrality of technology -- as these critics so effectively have -- then we do not live in absolute thrall to it. For understanding is the basis of our freedom.

    Freedom is admittedly a risky business. We can choose to ignore the sound warnings of these critics; we can continue giving full rein to the half-conscious impulses we have embedded in our machines even while abdicating the kind of responsibility the critics plead for. We can, that is, finally descend to equality with our machines. This would be a fearful symmetry indeed, precluding the sort of understanding from which freedom arises and sealing off the escape from pure, mechanical determination.

    Throughout the following pages my arguments will retain a double edge. At one moment I will emphasize the determining influence we have already given to our machines; the next moment I will urge the burden of freedom. There is no essential contradiction here. A recognition of what has been determining us is the only basis for a responsible freedom.

    Nor does either side of this double truth require me to call for a mindless rejection of technology. I will, in fact, have little to say about technology as such in this book. What I really fear is the hidden and increasingly powerful machine within us, of which the machines we create are an expression. Only by first coming to terms with our own "will toward artifice" can we gain the freedom to use wisely the artifices of our creation.

    References

    1. Barfield, 1986. First published in 1926.

    2. "The Harp and the Camera," a lecture subsequently published in Barfield, 1977b: 73-4.

    3. See chapter 5, "On Being Responsible for Earth."
