Some while back a reader urged upon me this principle: "Anything we do that can be automated should be automated". It's a principle that appeals to the common sense of many people today, and complements the notion that machines can unburden us of the more tedious and mechanized work, leaving us free to occupy ourselves with "higher" and more "human" tasks.
Appealing as the reader's suggestion is, I'm convinced that it readily promotes an unhealthy relation to technology. Here's why: First, it obscures the truth that nothing we do can be automated. Sure, I know that a computer can "add two plus two", but what it does is not what we do. It does not bring consciousness to the act. It is not exercising and therefore strengthening certain skills and cognitive capacities. It requires no attention, no will power, no motivation, no supportive metabolism, no memory, no imagination, and no sympathetic muscle movements. Nor is it engaged in any larger purpose when it carries out the computation -- or any purpose at all. It is amazing to see how readily we forget these things today and equate a computer's action with human performance.
When a machine "does" what we do, we typically mean that something about the structure of the machine's activity can be mapped (by us) to a narrow set of formal features abstracted from our own activity. For example, a pattern of electrical pulses can be seen as analogous to the formal structure of a problem in arithmetic addition. This sort of mapping happens to be very useful, but no worthwhile effort to assess the usefulness can begin with the false notion that the machine is doing what we do.
Actually, the more relevant fact is that the machine displaces and eliminates from the situation much that we do, leaving us to consider (1) how we might compensate for the disuse of our own capacities, and (2) how the entire context and significance of the work has been altered by its reduction to those few formal features.
It's all too easy for the facile calculations of spreadsheet software to begin narrowing a business's conception of its own work, even though the business may have begun with a richly meaningful and idealistic set of intentions. Intention doesn't enter into the software's calculations, and as that software plays an ever greater role in the business, the question is, "Where will the guiding intentions come from -- or will we simply allow them to disappear as we yield to the machine's empty guidance?"
There Is No Stopping Place
If the first problem with our reader's formulation is that nothing we do can be automated, the second problem is that everything can be automated. That is, once you equate the kind of reduction I've been talking about with "automating human activity", there's no line separating things that can be automated from those that cannot. So "automate whatever can be automated" provides no guidance whatever. In the reduced sense that applies, everything can be automated.
As many have pointed out, you can abstract some sort of formal structure from any activity you can describe (the description itself embodies a syntactic structure) and this structure can be impressed upon a machine. So as soon as you are convinced you have automated the simplest human activity, you are climbing a ladder possessing no special rung to mark a stopping place. If a calculator "does what we do", then a computer can in one sense or another do what a judge or composer or physicist does. If we do not pay attention to the difference between the computational abstraction and the human reality in the simple cases, nothing will require our attention to those differences in the "higher" cases.
Further, the more you automate, the more you tend to reduce the affected contexts to the terms of your automation, so that the next "higher" activity looks more and more like an automatic one that should be handed over to a machine. When, finally, the supervisor is supervising only machines, there's no reason for the supervisor himself not to become a machine.
So the idea that automation relieves us from grunt work in order to concentrate on higher things looks rather like the opposite of the truth. Automation tends continually to reduce the higher work to mechanical and computational terms. At least, it does this when we lose sight of the full reality of the work, reconceiving it as if its entire significance lay in the few decontextualized structural features we can analogize in a machine. (In a machine-driven world, we are always pressured toward this reconceptualization.) But if, on the other hand, we do not lose sight of the full reality of the work, then the "lower-level" stuff may look just as much worth doing ourselves as the "higher" -- in which case we have to ask, "What, really, is the rationale for automating it?"
This is not to say that, for example, endless hours spent manually adding columns of numbers would prove rewarding to most people. But where we typically run into such tasks is precisely where reductive technologies (such as those involved in the machinery of bookkeeping and accounting) have already shaped the work to be done. In general, the grunt work we want to get rid of is the result of automation, and while additional automation may relieve us of that particular work, it also recasts a yet wider sphere of work in terms seemingly fit only for automation. After all, the ever more sophisticated accounting software requires ever more extensive inputs, so more and more people in the organization find themselves caught up in paper-shuffling (or electronic file-shuffling).
It's where automation has not already destroyed the meaningfulness of the low-level work that we discover how high-level it can really be. The farmer may choose not to abandon his occasional manual hoeing -- not because he is a hopeless romantic, but because there is satisfaction in the simple rhythms, good health in the exercise, and essential knowledge of soil and crop conditions in the observations made along the way. What will provide these benefits when he resides in a sealed, air-conditioned cab fifteen feet off the ground?
A Strengthened Inner Activity
You may ask, then, "Should nothing be automated?" I didn't say that! I've only suggested that we avoid deluding ourselves about automation freeing us for higher things. Have we in fact been enjoying such a release? Any investigation of the matter will reveal that the machine's pull is most naturally downward. It's hard to relate to a machine except by becoming machine-like in some part of ourselves.
When we yield ourselves to automatisms, we become sleepwalkers. But if instead they serve as foils for our own increased wakefulness, then they will have performed a high service. After all, downward forces, too, can be essential to our health. We couldn't walk upright without the force of gravity to work against, and our muscles would atrophy without the effort.
It is, I think, inescapable that we should automate many things -- and, of course, there are many pleasures to be had in achieving this. When I said above that an automating mentality will not find any clear stopping place, I did not mean to imply that there should be such a stopping place -- certainly not in any absolute sense. In fact, I think it's wrong to imagine a stopping place defined in terms of the "objective" nature of the work.
Everything is potentially automatable in the restricted sense I have indicated, and pretending there is a natural stopping place only encourages the kind of mindless automation that is the real problem. What is crucial is for us to be aware of what we're doing and to find within ourselves the necessary compensations. We have to struggle ever more determinedly to hold on to the realities and meanings our automated abstractions were originally derived from. That is, we must learn to bring the abstractions alive again through a strengthened inner activity -- a tough challenge when the machine continually invites us to let go of our own activity and accept the task in reduced terms!
The limits of our compensatory capacities will always suggest wise stopping places, if we are willing to attend to those limits. But not absolute stopping places; they will shift as our capacities grow.
Are we currently setting the bounds of automation wisely? You tell me. Have the accounting software and the remarkable automation of global financial transactions been countered by our resolve to impose our own conscious meanings upon those transactions? Or, rather, does the entire financial system function more and more like a machine, merely computing an abstract bottom line?
Well, if you're looking at the dominant institutions, I imagine your answer will be pessimistic. But perhaps the most important developments for the future are the less conspicuous ones -- for example, the alternative food and health systems, the growing interest in product labeling, the investing-with-a-conscience movement. What's essential in these is the determination to restore the automated abstraction -- for example, the nutrient in the processed food, the number in the accountant's spreadsheet -- to the meaningful context it was originally ripped out of.
Holding the Balance
I guess the sum of the matter is that the restoration entails a gesture exactly opposite to the one expressed in, "if it can be automated, it should be". It's more like, "if it can be re-enfleshed, it should be". As long as these two movements are held in balance, we're probably okay. We should automate only where we can, out of our inner resources, re-enliven. For example, we should substitute written notes and email for face-to-face exchanges only so far as we have learned the higher and more demanding art of revivifying the written word so that it reveals the other person as deeply as possible and gives us something of his "presence". Of course, this is not the way most of us relate to email -- not even when the frenetic, email-influenced pace of work would allow it.
I suppose few would quarrel with the proposition that our society is much more gripped by the imperative to automate than the imperative to re-enflesh. Certainly this is ground for worry, given that the push for automation alone is a push to eradicate the human being.
The threat of eradication was Bill Joy's concern in his notorious "Wired" article. I share his concern, but it seems to me that the effort to define a fixed stopping place is inherently untenable; it just can't be done with any consistency. Nor would we expect that it could be done if we had grown accustomed to think organically and imaginatively, in terms of movement, balance, tension, polarity (exactly what our machines train us away from!).
It seems to me that some such awareness as I have tried to adumbrate here is the prerequisite for our avoiding the eventual loss of ourselves. It must be an awareness of our own, machine-transcending capacities. We must exercise these in a living, tensive balance as we counter the pull of all the mechanisms around us.
This is exactly the awareness that many of Joy's critics have refused. It seems obvious to Ray Kurzweil (author of "The Age of Spiritual Machines") that digital technologies will transform human consciousness, and not at all obvious that a transformed human consciousness is the only thing that can sustain future technologies -- just as transformations of human consciousness have been required to generate and sustain all earlier technologies. There's something self-fulfilling in Kurzweil's prophecies; when you lose sight of the machine-transcending qualities of your own mind, it is not surprising that you find yourself increasingly susceptible to machine-like influences.
In other words, one way for us to transform our powers of consciousness is to abdicate them. Then it really does become reasonable to see ourselves in a competition, perhaps even a desperate competition, with our machines. This is the inevitable conclusion of the single-minded drive to automate everything.
Joy's alarm is justified. But our core response, while it will certainly touch policy domains, must arise first of all in that place within ourselves where we are inspired to re-enflesh whatever can be re-enfleshed. To focus instead merely on stopping automation is already to have accepted that the machine, rather than our own journey of self-transformation, is the decisive shaper of our future. Yes, we urgently need to find the right place for our machines, but we can do so only by finding the right place for ourselves.
This article originally appeared in "NetFuture", a freely distributed newsletter dealing with technology and human responsibility. It is published by The Nature Institute, 169 Route 21C, Ghent NY 12075 (tel: 518-672-0116). Postings occur roughly every couple of weeks. The editor is Steve Talbott, author of "The Future Does Not Compute: Transcending the Machines in Our Midst".
I would like to comment about your article "To Automate or Re-enflesh?" in NetFuture 122. All the readers of your article are adults, and most are probably willing to at least take the warnings seriously. Nevertheless, the impulse to automate everything automatable, under the impression that to do so is beneficial, is a widespread phenomenon. We adults can say: Wait a minute -- do we really want this? But what about the children, who have no way of knowing what is happening to them? The pervasiveness of automation is relatively new, so we grew up in a milieu quite different from the one our children are experiencing.

We sit at home working and playing with computers, and our kids, those imitators par excellence, naturally want to do the same. And they do, in their own way -- especially with the games. Do we forbid? Difficult, when they are only copying us. At most we can impose limits of time and content if we don't wish to be looked upon as scrooges.

But what about the schools? There is a campaign afoot to computerize education -- from kindergarten on up -- and this is the most dangerous. If the adult world is on the road to total automation, can not at least the children be spared? Can not schools wait until children reach the appropriate age for automation? You mentioned the necessity for finding reasonable stopping places. I suggest that the most urgent stopping place is in school -- at least in pre- and primary schools. Difficult, I know, because this is an extremely lucrative business for software and hardware companies who, together with the politicians, have convinced parents that their kids should be computer literate before their milk-teeth fall out. In Argentina, where I live, the president has promised that before his mandate ends there will be a computer in every classroom; this in a country where many children can't even afford pencils and notebooks and the educational system is a shambles.
Your excellent article will be reprinted in the next issue (November-December) of SouthernCross Review.