Explanations that rely only on impersonal social forces and cultural imperatives, viewed as inevitably producing particular human behaviors, make for incomplete social analysis. Any adequate theory of modern society must include people as active, purposeful, and innovative beings whose future-oriented behavior helps create not only their own future but also the social order itself. Social research directed at the investigation of such human agency contributes to our knowledge of futures thinking and accountability. Although people produce consequences for which they ought to be held accountable, they often do so only more or less competently. Their competence can be improved by teaching them the principles of futures thinking. They can become more responsible actors by learning to search more fully for possible futures, to forecast probable futures more accurately, and to make judgments of preferable futures more objectively.
I could write a book or two--and probably already have (Bell, 1997)--in response to the five questions posed by Jim Dator, the editor of this special issue. There is far too much that can be said to fit into this short article. Thus, I will limit my answers to a brief account of only a few points, focusing mostly on three of the assumptions underlying futures studies:
1. Humans by their behavior constantly shape their natural and social environments and, in so doing, shape their own future, although not always in ways that they intend or understand.
2. Disciplined and valid prospective thinking can help people shape their environments and their futures effectively and responsibly.
3. Explicit and objective moral analysis can help people responsibly create preferable futures.
As prologue, let me say that I began teaching a course in futures studies at Yale with some form of future in the title (e.g., Futuristics, Futures Research, Social Change and the Future) in 1967. Moreover, I confess that during the past three decades in all the courses that I taught on whatever topic--from Introductory Sociology, to Class, Race, and Nation in the Caribbean, and even to the Logic of Social Research--I always included some of the principles of futures thinking. I did so at the risk of appearing to be yet another academic imperialist pushing his or her latest intellectual infatuation because I believed that, despite the existence of a social forecasting industry and the general acceptance of prediction as an indicator of valid scientific knowledge, competent futures thinking was largely missing from the mainstreams of the various social sciences (Bell, 1974). No social science, I believed, could be fully acceptable without a healthy dose of futures thinking.
Have I mellowed over the years and softened this view? Definitely not. To the contrary, I would go further today and make a stronger assertion: No college education is adequate unless it includes some systematic study of the concepts and principles of the futures field. One reason is that self-conscious futures thinking helps people become more responsible for their actions. Another reason is that any understanding of contemporary social change, the nature of the modern social world, and key features of the coming future is dangerously incomplete without the insights provided by futures studies. In this article, I try to give a few explanations of why this is so.
THE INDIVIDUAL AND SOCIETY
During my first year in graduate school at the University of California, Los Angeles in 1949, I read that it "has been the contention of sociologists from Auguste Comte and Lester F. Ward to [Pitirim A.] Sorokin that the chief justification of sociology is the guidance it can furnish to public officials and private citizens relative to building a better social order" (Barnes, 1948, p. xi).
As an aspiring sociologist, I was challenged by these words, imagining a future professional career providing such guidance.
But I was also puzzled. Only two pages earlier in the same book, I had read that different sociologists had "highly contrasting conceptions as to the possibilities of social planning"--that Comte, Morgan, and Ward believed "that the main purpose of sociology is to facilitate planned progress," whereas "Spencer, Sumner, and Gumplowicz" held an opposite view and that, for them, the great practical service of sociology is to warn about the "danger of the notion that man can facilitate and hasten social progress through deliberate action" (Barnes, 1948, p. ix).
Although this account, I soon learned, is an oversimplification, it nonetheless points to contradictory views, sometimes unacknowledged and even denied, that still exist among sociologists today. Some sociologists--probably the vast majority--view society as shaping the individual, providing him or her with the illusion of having autonomous choices, while, in fact, being a system of social control and cultural forces that more or less inevitably determine his or her beliefs, attitudes, and behaviors.
Other sociologists, to the contrary, view society as a product of individual and collective choice and decision. For them, the actions and interactions of purposive individual actors importantly shape and construct the social order itself. Although admitting that there are often unintended and unanticipated consequences that require constant correction, they see social change largely as the result of deliberate action or inaction. They view society, as it actually is at any given time, as problematic, merely one of many possible outcomes that could have resulted from what individuals, separately and together, might have decided to do.
As stated, of course, these are extreme views. Most sociological works fall somewhere between them, rightly acknowledging some truth in each. Yet even in the most sophisticated writings we can find a preference for one or the other perspective. For example, Vaughan (1996) highlights the causal influences of organizational structures and culture in her explanations of the events that led to the explosion of the space shuttle Challenger. As we all know from the extensive media coverage at the time, 73 seconds after launch at Cape Canaveral, Florida, on January 28, 1986, the Challenger disappeared in a fireball and huge cloud of smoke, then dropped into the Atlantic, killing all seven crew members (Vaughan, 1996, p. 7).
Vaughan (1996) directs our attention to "the relentless inevitability of mistake in organizations" (p. xv), that is, to how social determinants systematically shaped the behavior of the individuals involved in the fatal decision to launch. She does acknowledge that at the very top levels of management, some individuals could have behaved differently and, thus, could have acted to avoid the disaster. Yet she sees no merit in searching for individual moral responsibility as an explanation of the catastrophe and the deaths that occurred. Even though she recognizes how the work culture had been produced at NASA before the launch decision, in her analysis of the decision itself she views culture as fixed and determinative. For example, she speaks of "cultural scripts" and views people as simply playing roles that have been scripted in advance by social and cultural imperatives. Thus, individuals, in her view, can no more be held accountable for their acts than can puppets.
Others have contested this view. Allinson (1997) in his review of Vaughan's book points out that things could have happened differently. The explosion resulted because the giant rubber gaskets designed to keep combustible hot gases from leaking out during liftoff, known as O-rings, failed after exposure to the cold weather. But other, better, technology than the O-rings could have been employed in the original design and, in fact, had been proposed. Given the O-ring technology, the engineers who knew it best recommended against launching, but the flight managers failed to heed their advice. Additionally, the flight managers could have informed the senior management of the engineers' opposition to launching, but they did not. If the senior managers had been informed, they could have vetoed the decision to launch, as, indeed, they later said they would have. The lives of the astronauts could have been saved if the Challenger had been equipped, as it could have been, with an abort system and parachute-descent package.
Also, Vaughan (1996) takes great pains to show that the decision to launch was not a deviant act given the cultural setting of NASA within which it took place. The tragic decision to launch, she contends, was not the result of anyone violating any rules. To the contrary, it was the result of rule-abiding behavior. People were simply doing their jobs as they were supposed to. It was conformity, not deviance, that led to the decision. Therefore, she concludes, managers were …