The Future as a Way of Life

Alvin Toffler

“It’s a poor sort of memory that only works backwards.”

THE WHITE QUEEN IN Through the Looking-Glass

As more Americans travel abroad, the term “culture shock” is beginning to creep into the popular vocabulary. Culture shock is the effect that immersion in a strange culture has on the unprepared visitor. Peace Corps volunteers suffer from it in Borneo or Brazil. Marco Polo probably suffered from it in Cathay. Culture shock is what happens when a traveler suddenly finds himself in a place where “yes” may mean no, where a “fixed price” is negotiable, where to be kept waiting in an outer office is no cause for insult, where laughter may signify anger. It is what happens when all the familiar psychological cues that help an individual to function in society are suddenly withdrawn and replaced by new ones that are strange or incomprehensible.

The culture-shock phenomenon accounts for much of the bewilderment, frustration, and disorientation that plague Americans in their dealings with other societies. It causes a breakdown in communication, a misreading of reality, an inability to cope. Yet culture shock is relatively mild in comparison with a much more serious malady that might be called “future shock.” Future shock is the dizzying disorientation brought on by the premature arrival of the future. It may well be the most important disease of tomorrow.

Future shock will not be found in Index Medicus or in any listing of psychological abnormalities. Yet, unless intelligent steps are taken to combat it, I believe that most human beings alive today will find themselves increasingly disoriented and, therefore, progressively incompetent to deal rationally with their environment. I believe that the malaise, mass neurosis, irrationality, and free-floating violence already apparent in contemporary life are merely a foretaste of what may lie ahead unless we come to understand and treat this psychological disease.

Future shock is a time phenomenon, a product of the greatly accelerated rate of change in society. It arises from the superimposition of a new culture on an old one. It is culture shock in one’s own society. But its impact is far worse. For most Peace Corps men, in fact most travelers, have the comforting knowledge that the culture they left behind will be there to return to. The victim of future shock does not.

Take an individual out of his own culture and set him down suddenly in an environment sharply different from his own, with a different set of cues to react to, different conceptions of time, space, work, love, religion, sex, and everything else; then cut him off from any hope of retreat to a more familiar social landscape, and the dislocation he suffers is doubly severe. Moreover, if this new culture is itself in a constant turmoil of revolutionary transition, and if—worse yet—its values are incessantly changing, the sense of disorientation will be still further intensified. Given few clues as to what kind of behavior is rational under the radically new circumstances, the victim may well become a hazard to himself and others.

Now imagine not merely an individual but an entire society, an entire generation—including its weakest, least intelligent, and most irrational members—suddenly transported into this new world. The result is mass disorientation, future shock on a grand scale.

This is the prospect that man now faces as a consequence of accelerated change—the prospect of dislocation far more subtle, complex, and continuous than any we have known. Change is avalanching down upon our heads and most people are utterly unprepared to cope with it.

Is all this exaggerated? I think not. It has become a cliché to say that we are now living through a “second industrial revolution.” This phrase is supposed to impress us with the speed and profundity of the change around us. But beyond being platitudinous, it is misleading. For what is occurring now is a transformation that is, in all likelihood, bigger, deeper, and more important than the industrial revolution. In fact, there is a growing body of reputable opinion that the period we are now living through represents nothing less than the second great divide in human history, comparable in magnitude only with that first great break in historic continuity, the shift from barbarism to civilization.

This idea has begun to occur with increasing frequency in the writings of scientists, social critics, economists, business analysts, and others. For example, Sir George Thomson, the British physicist and Nobel prize winner, suggests in The Foreseeable Future that the nearest historical parallel with today lies in the “invention of agriculture in the neolithic age.” John Diebold, the automation expert, is among many who assert that we are seriously underestimating the degree of change that will occur in our society as a result of cybernetics. “It is the rate of change itself which I believe to be the most significant phenomenon of all,” he writes. He warns that “the effects of the technological revolution will be deeper than any social change we have experienced before.” Kurt W. Marek, the student and popularizer of archaeology, observes that “we, in the twentieth century, are concluding an era of mankind five thousand years in length. . . . We are not, as Spengler supposed, in the situation of Rome at the beginning of the Christian West, but in that of the year 3000 B.C. We open our eyes like prehistoric man, we see a world totally new.” Lewis Mumford in The City in History writes ominously about the coming of “Post-historic Man.”

The most recent and elaborate statement of this theme has come from Kenneth Boulding, an eminent economist and an imaginative social thinker. Referring to the twentieth century as the second great transition in the history of mankind, Boulding writes:

“The first transition was that from precivilized to civilized society. This is a transition that is still going on in some parts of the world, although it can be regarded as almost complete. Precivilized society can now be found only in small and rapidly diminishing pockets in remote areas. It is doubtful whether more than 5 per cent of the world’s population could now be classified as living in a genuinely precivilized society.

“Even as the first great transition is approaching completion, however, a second great transition is treading on its heels. It may be called the transition from civilized to postcivilized society.”

Most of those who make this comparison merely state it and let it go at that. Boulding, in a series of lectures and articles and in his latest book, The Meaning of the Twentieth Century, attempts to justify this dramatic view of the present as a truly nodal moment in human history. Among other things, he points out that “as far as many statistical series related to activities of mankind are concerned, the date that divides human history into two equal parts is well within living memory.” In effect, he is saying, our century represents The Great Median Strip running down the center of human history. This is a startling assertion, but it is borne out in many ways.

Dr. Homi Bhabha, a distinguished Indian atomic scientist and the Chairman of the first International Conference on the Peaceful Uses of Atomic Energy, once analyzed the rate at which energy consumption is rising. “To illustrate,” he said, “let us use the letter Q to stand for the energy derived from burning some thirty-three thousand million tons of coal. In the eighteen and a half centuries after Christ, the total energy consumed averaged less than half a Q per century. But by 1850, the rate had risen to one Q per century. Today, the rate is about ten Q per century.” This means, roughly speaking, that half of all the energy consumed by man in the past two thousand years has been consumed within the last one hundred.
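Bhabha’s rough arithmetic can be checked directly. The short sketch below assumes two figures the quotation leaves unstated: a pre-1850 average of a quarter Q per century (his text says only “less than half a Q”) and smooth exponential growth in the rate from 1850 onward. Under those assumptions, the last century or so does indeed account for roughly half of the two-thousand-year total.

```python
import math

# Assumed and quoted figures (1 Q = energy from burning ~33,000 million tons of coal)
PRE_1850_RATE = 0.25   # Q per century -- assumed midpoint of "less than half a Q"
RATE_1850 = 1.0        # Q per century (quoted)
RATE_TODAY = 10.0      # Q per century (quoted)
YEARS_OF_GROWTH = 115  # roughly 1850 to the mid-1960s

# Energy consumed from A.D. 1 to 1850: 18.5 centuries at the assumed average rate.
early_total = 18.5 * PRE_1850_RATE

# Assume the rate grew exponentially from 1 to 10 Q/century over 115 years,
# then integrate that rate (converting Q/century to Q/year) over the period.
k = math.log(RATE_TODAY / RATE_1850) / YEARS_OF_GROWTH   # per-year growth constant
recent_total = (RATE_TODAY - RATE_1850) / (k * 100)      # total Q since 1850

share_recent = recent_total / (early_total + recent_total)
print(f"early ~{early_total:.1f} Q, recent ~{recent_total:.1f} Q, "
      f"recent share ~{share_recent:.0%}")
```

With these assumed inputs the recent share comes out near one half, which is all Bhabha’s “roughly speaking” claim requires.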

A comment from Boulding serves almost as a footnote to this finding: “For many statistical series of quantities of metal or other materials extracted, this date [i.e., the dividing line] is about 1910. That is, man took about as much out of mines before 1910 as he did after 1910.”

The same is true of another kind of mining—the mining of knowledge. Information, itself a prime catalytic force in the process of social change, is proliferating at a mind-numbing rate. According to Charles P. Bourne of the Stanford Research Institute, the number of significant journals now being published around the world is on the order of fifteen thousand with perhaps a million significant papers in them each year. These figures do not include books and other forms of publication. Information specialists say that the sheer quantity of information is now doubling every ten years.

Population is, too. Boulding calls to our attention the striking fact that about 25 per cent of all the human beings who have ever lived are now alive, and it is clear from other sources that before long that figure, too, will reach 50 per cent. Sir Julian Huxley has pointed out that it took Homo sapiens at least a hundred thousand years to reach a population of two-thirds of a billion. “This was in 1650,” Huxley notes. “From this date, it took nearly two hundred years for world population to double itself once, but the second doubling took only one hundred years. It has more than doubled itself again in the sixty-four years of the present century and will certainly double itself still again within the next forty years.”

Similarly, ours is the century in which, at least in many countries, agriculture has ceased to be the dominant economic activity. This is particularly important because of the close historical relationship between agriculture and what we fondly call civilization.

Without the development of agriculture, Boulding reminds us, there could have been no food surplus to support “knowledge workers” in society and there would have been no coalescence of the city. But the surplus food supply made possible by agriculture in ancient societies was meager at best. “Whether it was Sumeria, Egypt, Greece, Rome, Ancient China, the Incas, or the Mayans, all these were societies based on a food surplus . . . that rarely exceeded 20 or 25 per cent of the total product. In these circumstances three quarters to four fifths of the population [had to devote themselves to] agriculture or other food production, and these people [could] barely produce enough to feed the remaining quarter or fifth of the population. . . . Almost all the cities of classical civilization were within a few weeks of starvation at any time.”

In contrast, in the United States today only 10 per cent of the population is engaged in agriculture, and this small percentage is capable of producing more food than anyone knows what to do with. Moreover, it is obvious, as Boulding observes, that “if present trends continue it will not be long before we can produce all the food that we need with 5 per cent, or even less, of the population.”

In the United States a further stage has been reached. In 1900 the nonfarm labor force already outnumbered the agricultural labor pool. But of this nonfarm group, 57 per cent were in blue-collar occupations. By 1960, more Americans were wearing white collars than blue. This, too, represented a qualitative change in society. For the first time in human history a society, having shrugged off the economic domination of agriculture, proceeded to shrug off the domination of manual labor.

Still another “first” is within sight. According to Professor Kingsley Davis, an authority on urbanization, by the year 2000—only thirty-five years off—one quarter of all the people in the world will be living in cities of a hundred thousand or more. By 2050 the figure will be one half of the world’s population. What this shift will mean in terms of values, family structure, jobs, politics, and the structure of cities themselves, is staggering.

All these changes represent giant spurts in movements that have been continuous, in one form or another, since history began. If we add to these truly historic jumps the list of developments or processes that simply did not begin until the twentieth century—air travel and space flight, television, the development of nuclear energy, the invention of the computer, the discovery of DNA with its possibilities for the control of evolution—the sharpness of the break with the past becomes even clearer. Given these, it becomes impossible to sustain the argument that what is happening now is anything like “normal” progress, even for the kind of industrial society we have known for the past century. It is not merely a “second industrial revolution.” Viewed as a violent break with historic continuity, our age takes on a significance that few ages in the past have had.

Nevertheless, isn’t it possible that the life of the individual will remain largely untouched? The answer is that the mood, the pace, the very “feel” of existence, as well as one’s underlying notions of time, beauty, space, and social relations will all be shaken.

Take the matter of tempo. John Diebold never wearies of pointing out to businessmen that they must begin to think in terms of shorter life spans for their products. Smith Brothers’ Cough Drops, Calumet Baking Soda, Ivory Soap, have become institutions by virtue of their long reign in the market place. In the days ahead, he suggests, few products will enjoy such longevity. Corporations may create new products knowing full well that they will remain on the market for only a matter of a few weeks or months. By extension, the corporations themselves—as well as unions, government agencies, and all other organizations—may either have shorter life spans or be forced to undergo incessant and radical reorganization. Rapid decay and regeneration will be the watchwords of tomorrow. The men and women who live, work, and play in a society where whole categories of merchandise seen on the shelves of the nearby store last month are no longer manufactured today and where their own place in the bureaucratic structure of society is being constantly reshuffled, will have to use entirely new yardsticks for measuring the passage of time in their own lives.

Even our conceptions of self will be transformed in a world in which the line between man and machine grows increasingly blurred. It is now almost certain that within a matter of years the implantation of artificial organs in human bodies will be a common medical procedure. The human “body” in the future will often consist of a mixture of organic and machine components. What happens to the definition of man when one’s next-door neighbor or oneself may be equipped with an electronic or mechanical lung, heart, kidney, or liver, or when a computer system can be plugged into a living brain? How will it “feel” to be part protoplasm, part transistor? What new possibilities will it open? What limitations will it place on work, play, sex, intellectual or aesthetic responses? How will it feel to have information transferred electronically between computer and brain? What happens to mind when body is changed? Such fusions of man and machine—called “Cyborgs”—are closer than most people suspect. As Fortune magazine, not ordinarily given to overstatement, has reported, “these are not just fantasies; they are extensions of work already being done in laboratories.”

Allied to this are the implications of research being done today on the operations of the brain. For the first time, we are beginning to understand something about how the brain functions. The scientist and philosopher Percy Bridgman has observed that “thinking is done with the brain, which is a nervous system,” and that “any possible thought must be subject to the limitations imposed by the system which produces it . . . all human thinking since the beginning of thought has been ignoring inexorable limitations. When we find what the limitations are I believe that our philosophy, religion, and politics will be revolutionized.”

Similarly, as Professor Boulding points out, our very conception of life itself will be smashed and replaced. “The implications of the biological revolution alone are immense,” he says. “We have the code of life. We can’t write it yet, but that cannot be too far off. Artificial virus is close. This is the synthesis of life. Before long we may well crack the secret of aging. What happens when we do? Even if life were increased by twenty or forty years, it would shatter every relationship and institution we now know—the family, sex, marriage, the relationship between age groups, education. The work now being done raises the question: Is immortality possible?

“Even if what is involved is only a finite extension of the life span, it creates all kinds of new ethical problems. Who gets immortality or added years? On what basis? The medical profession today already faces unusual ethical problems in connection with deciding which of many patients who need it will be lucky enough to be plugged into the new artificial kidneys. There will be other moral issues—completely new to us. All this sounds utterly farfetched, but I would not be at all surprised to see some of those things come to pass by the end of the century. Sound close? Remember that it was less than thirty-five years from Los Alamos and Hiroshima back to Rutherford’s basic discoveries about the nucleus of the atom. And things are moving much faster today.”

Work, too, is being transformed. There was a time when for men “the job” was a central organizing principle of life. One’s living arrangements, one’s hours, income, everything, was determined, or at least heavily influenced, by the nature of one’s job. In the cybernetic society that lies around the corner the entire present occupational structure of society will be overthrown. A great many professions simply will not exist, and new ones as yet unpredicted will spring up. This must radically affect the texture of everyday life for millions.

In the past it was possible to know in advance what occupations would exist when a boy became a man. Today the life span of occupations has also been compressed. The computer programmer, who was first heard of in the 1950’s, will be as extinct as the blacksmith within a matter of years, perhaps by 1970. Individuals now train for a profession and look forward to remaining in that profession for the entire period of their working life. Yet within a generation the notion of serving in a single occupation for one’s entire life may seem quaintly antique. Individuals may need to be trained to serve successively in three, four, or half a dozen different professions in the course of a career. The job will no longer serve as man’s anchor and organizing principle.

The shrinking role of the job will be further diminished by the extension of leisure. Within half a century the productive machinery of society may need so few tenders that a great many individuals—and not just women—will be born who will simply never work a day in their lives at a paid job. Those who do work will, without question, work shorter hours than we do today. The work week has been cut by 50 per cent since the turn of the century. It is not out of the way to predict that it will be slashed in half again by 2000. This is not so far off that we can shrug the implications aside. How will we spend the long hours away from work? How will the role of the father be transformed with the loss of his historic role as family provider? What new forms of leisure will emerge? In a society in which work has become less central than it has been, how will we determine the esteem in which we hold a man?

Whether any or all of these developments occur precisely in the way suggested is not important. The commanding point is that we can anticipate volcanic dislocations, twists and reversals, not merely in our social structure, but also in our hierarchy of values and in the way individuals perceive and conceive reality. Such massive changes, coming with increasing velocity, will disorient, bewilder, and crush many people.

We think of ourselves as sophisticated and well educated, but how well prepared are we, as a society, to cope with the sudden new sensations, pains, intellectual turnabouts, eruptions, and shifts in perception that are likely to confront us as we speed forward into a culture in which computers can learn, and can improve upon their own performance, in which man is no longer the only manifestation of high-level intelligence on the face of the earth, and in which, in fact, he may crash up against the realization that his globe is not the only inhabited parcel of real estate in the universe?

“We all need,” says psychiatrist John R. Rees, Honorary President of the World Federation of Mental Health, “a sense of our own identity, that we have a certain place in the structure of our society, and that we fit into that. We have constantly to change our identity when we move into a different sort of world, and we often get considerable confusion of identity, something that is disturbing to many people. They cannot take it. . .”

If the degree of change and the speed of that change is even remotely close to what I have suggested, it must be obvious that the shift to Professor Boulding’s “postcivilization” may place unendurable stress on a great many people. For the current upbringing of most people, and the subtly inculcated sense of time that comes with it, are both inimical to adaptability.

A psychologist named Lawrence LeShan some years ago conducted an experiment in time orientation among groups of subjects from different socio-economic classes. Those from what he designated as the lower class tended to look at time in terms of immediate action and immediate gratification. The future was nebulous and unpredictable. There was a tendency for them to believe, as many Arabs do, that any attempt to predict the future is not only futile but evidence of mild lunacy.

In the sample highest up the economic ladder, LeShan found a pronounced identification with the past. Johnny is told, if he flunks his civics course, that his grandfather would have been ashamed of him. Family meals are set at fixed, traditional hours, rather than when hunger strikes or at convenient times. To a degree, lives are lived to carry out sequences started by previous generations.

Among the middle-class group, LeShan found a totally different time orientation. These subjects tended to think in terms of longer time sequences than the lower-class group. Children were taught to orient themselves toward goals fixed in the fairly distant future. Johnny is told to study not because his grandfather would be ashamed of him for flunking but because flunking might make it impossible for him to achieve the goal of becoming a lawyer or doctor or whatever.

But while these time orientations differ, they all assume either an unpredictable or an unchanging future. The assumption of the middle- and upper-class samples is that the contours of society will stay the same in the future. When a middle-class mother talks about Johnny becoming a lawyer, she is deceiving herself and her son because she has no conception of what being a lawyer will mean two decades hence. She has no notion of the potential, for example, of “jurimetrics.” Jurimetric research, a product of the computer revolution, involves electronic data processing, information storage and retrieval (collecting, collating, and making immediately available all relevant legal precedents), the analysis of legal decisions, and the use of symbolic logic. It may have immense consequences for the making and administration of law, as well as the pleading of cases. Mama does not know it, but the lawyer of the future will not fit the mold she has in mind.

The fact is—and simple observation of one’s own friends and associates will confirm it—that even the most educated people today operate on the assumption that society is relatively static. At best they attempt to plan by making simple straight-line projections of present-day trends. The result is unreadiness to meet the future when it arrives. In short, future shock.

What, if anything, can be done to lessen this shock and the disorientation and disability that come with it? Society has many built-in time spanners that help to link the present generation with the past. Our sense of the past is developed by contact with the older generation, by our knowledge of history, by the accumulated heritage of art, music, literature, and science passed down to us through the years. It is enhanced by immediate contact with the objects that surround us, each of which has a point of origin in the past, each of which provides us with a trace of identification with the past.

No such time spanners enhance our sense of the future. We have no objects, no friends, no relatives, no works of art, no music or literature, that originate in the future. We have, as it were, no heritage of the future.

Despite this, there are ways to send the human mind arching forward as well as backward. We need to begin by creating a stronger future-consciousness on the part of the public, and not just by means of Buck Rogers comic strips and articles about the marvels of space travel or medical research. These make a contribution, but what is needed is a concentrated focus on the social and personal implications of the future, not merely on its technological characteristics.

If the contemporary individual is going to have to cope with the equivalent of millenniums of change within the compressed span of a single lifetime, he must carry within his skull a reasonably accurate (even if gross) image of the future.

Medieval men possessed one—an image of the afterlife, complete with vivid mental pictures of heaven and hell. We need now to propagate a dynamic, nonsupernatural image of what temporal life will be like, what it will sound and smell and taste and feel like in the fast-onrushing future.

To create such an image and thereby soften the impact of future shock, we must begin by making speculation about the future respectable. Instead of deriding the “crystal ball gazer,” we need to encourage people, from childhood on, to speculate freely, even fancifully, not merely about what next week holds in store for them but about what the next generation holds in store for the entire human race. We offer our children courses in history; why not also make a course in “Future” a prerequisite for every student, a course in which the possibilities and probabilities of the future are systematically explored, exactly as we now explore the social system of the Romans or the rise of the feudal manor?

We train our Peace Corps volunteers by attempting to give them advance knowledge about the conditions and culture of the country to which they are assigned. In doing so, we minimize culture shock. Why not devise an education designed to minimize future shock?

We do not have a literature of the future, but we do have literature about the future, consisting not only of the great utopias but also of contemporary science fiction. Science fiction is held in low regard as a branch of literature, and perhaps it deserves this critical contempt. But if we view it as a kind of sociology of the future, rather than as literature, science fiction has immense value as a mind-stretching force for the creation of future-consciousness. Our children should be studying Arthur C. Clarke and Robert Heinlein and William Tenn not because these writers can tell them about rocket ships and time machines but, more important, because they can lead young minds through an imaginative exploration of the jungle of political, social, psychological, and ethical issues that will confront these children as adults. Science fiction should be required reading for Future I.

But science fiction is largely speculative, and while speculation is useful, we must go beyond it. We must kill, once and for all, the popular myth that the future is “unknowable.”

At the last meeting of the American Association for the Advancement of Science a University of Colorado researcher named George A. Dulk predicted that a burst of radio signals from the planet Jupiter would be received in Colorado at midnight, December 31, 1964. On New Year’s Eve, at 11:40 P.M.—just a few minutes ahead of schedule—a radio signal that sounded like the sizzle of a frying pan was picked up by the eighty-five-foot radio telescope at the National Center for Atmospheric Research. Since the signals did not come from intelligent creatures at the other end—there was never any question of that—the fulfillment of Dulk’s prediction caused only a minor stir. For predictions—even startling predictions—of this kind are, of course, the bread and butter of science. They are entirely routine.

Every day brings improvement in man’s ability to peer into the darkness ahead, and this is true in the social as well as the “hard” sciences. It is now possible to predict with fair accuracy a wide range of social phenomena—such as the number of babies to be born in Indiana in the first three months of 1970, the way in which the housewife will divide her income between purchases and savings next year, the percentage of the national production that will be allocated to wages in any given year, the number of people who will die of bathtub accidents, and, within a percentage point or two, the number of Americans who will vote for one Presidential candidate over another.

It would, of course, be foolish to oversell the ability of science, “hard” or “soft,” to foretell the future. But the danger today is not that we will overestimate this ability, but that we will underutilize it. “Instead of thinking that either we can or cannot predict the future,” wrote William F. Ogburn, one of the century’s great students of social change, “we should admit into our thinking the idea of approximations, that is, that there are varying degrees of accuracy and inaccuracy of estimate.” Furthermore, he pointed out, complete accuracy is not necessary in many fields. A rough idea of what lies ahead is better than none.

Thus, in addition to encouraging the speculative turn of mind, we must undertake to train individuals in the techniques of prediction and of scientific method in general. This does not mean more courses in biology or physics or chemistry. It means more attention to philosophy and logic, and perhaps special courses in “how to predict.” How many of us, even among the educated public, understand the meaning of a random sample, or of probability, or of prediction by correlation? This does not mean that all of us need to be statisticians or logicians or mathematicians. But the principles of scientific prediction can and should be grasped by every youngster who graduates from high school, for these are not merely the tools of “scientific research,” they are powerful instruments for dealing rationally with the problems of everyday existence. Ignorance of them constitutes a form of functional illiteracy in the contemporary world.
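To make one of those principles concrete: “prediction by correlation” at its simplest means fitting a straight line to past observations and extrapolating it, then remembering Ogburn’s caution that the result is only an approximation. The sketch below illustrates the technique with invented figures (they stand in for any yearly series and are not drawn from the text).

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept for y ~ a + b*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y divided by variance of x.
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Hypothetical yearly figures (say, journals published, in thousands).
years = [1960, 1961, 1962, 1963, 1964]
values = [11.0, 12.1, 12.9, 14.2, 15.0]

a, b = fit_line(years, values)
forecast_1970 = a + b * 1970
print(f"trend: {b:.2f} per year, straight-line forecast for 1970: {forecast_1970:.1f}")
```

This is exactly the kind of simple straight-line projection the essay says educated people already rely on; the point of teaching the method explicitly is that one also learns when the projected line should not be trusted.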

The willingness to speculate freely, combined with a knowledge of scientific method and predictive techniques, is coming to be valued in precisely the place where, in the past, the greatest emphasis has been placed on conservatism and feet-on-the-ground “realism.” Corporations are beginning to hire men who are, in effect, staff prophets, men willing to look ahead ten years or more. “A few years ago,” says Tom Alexander in Fortune magazine, “most such people were called crackpots. A lot of bitter recent experience has shown that such crackpots are too often likely to be right, and that it is usually the ‘sound thinkers’ who make fools of themselves when it comes to talking about the future.” Shortly after reading this, I learned of a giant corporation that is looking for a science-fiction writer to come in, analyze its operations, and write a report about what the company will look like fifty years from now.

The idea of hiring “prophets,” “wild birds,” or “blue-skyers,” as they are variously known, is one that should be adopted not only by corporations but by all the major institutions of our society. Schools, cultural organizations, and government agencies at the city, state, and federal level, should begin to emulate industry in this respect. It would be refreshing and healthy if some of these blue-skyers were to be retained by Congress and state legislatures. They should be invited into classrooms to lecture and lead discussions all over the country. This could have more than direct educational value. It would, I believe, thrill and inspire our children, many of whom now look forward to the future with foreboding and a sense of futility.

Finally, we might consider creating a great national or international institute staffed with top-caliber men and women from all the sciences and social sciences, the purpose of which would be to collect and systematically integrate the predictive reports that are generated by scholars and imaginative thinkers in all the intellectual disciplines all over the world. Of course, those working in such an institute would know that they could never create a single, static diagram of the future. Instead, the product of their effort would be a constantly changing geography of the future, a continually recreated overarching image based on the best predictive work available. The men and women engaged in this work would know that nothing is certain; they would know that they must work with inadequate data; they would appreciate the difficulties inherent in exploring the uncharted territories of tomorrow. But man already knows more about the future than he has ever tried to formulate and integrate in any systematic and scientific way. Attempts to bring this knowledge together would constitute one of the crowning intellectual efforts in history—and one of the most worthwhile.

Man’s capacity for adaptation may have limits, but they have yet to be defined. If, as Margaret Mead has shown, the Manus of New Guinea could, within a twenty-five-year period, pass from Stone Age primitivism into a twentieth-century way of life, and do so happily and successfully, modern man should be able to traverse the passage to postcivilization. But he can accomplish this grand historic advance only if he forms a better, clearer, stronger conception of what lies ahead. This is the only remedy for the phenomenon of future shock.