Autonomous Technology


The term autonomous technology is associated with arguments that modern technology has grown out of control or develops independently of any particular human intention or plan. It is usually used to highlight how undesirable aspects of technological society undermine human autonomy, thus signaling its ethical relevance. This clear ethical connotation distinguishes autonomous technology from the notion of technological determinism, with which it is often associated.

Challenging the taken-for-granted notion of technology as simply an instrument or a tool, as well as the belief in human freedom, the concept of autonomous technology has been at the center of various controversies in the philosophy of technology, where it has functioned in three related contexts. First, it has served to articulate an uneasy feeling that has accompanied the mastery of nature and the fast pace of technological change since the Industrial Revolution. As early as the nineteenth century, stories were written about human beings being ruled by "their" mechanical creatures, which had gained autonomy; Mary Shelley's famous novel Frankenstein (1994 [1818]) is the best-known example. Second, the concept has been associated with those philosophers who stressed the alienating and dehumanizing aspects of modern technology, such as Martin Heidegger (1889–1976), Herbert Marcuse (1898–1979), and Lewis Mumford (1895–1990). Third, and finally, there are those who popularized the term and made it a central theme in their analyses of technology; here the natural references are Jacques Ellul and Langdon Winner.


Theories of Autonomous Technology

Ellul (1954) presents characteristics of modern technology such as automatism, self-augmentation, universalism, and autonomy, the last of which summarizes the rest. Ellul claims that modern technology, unlike traditional technology, is not bound by any heteronomous rules or principles but develops according to its own rules. As its scale and pervasiveness increase, the development of technology (Ellul's term is la technique) is influenced neither by sociopolitical and economic changes nor by moral and spiritual values. Rather, technological change itself now defines the context of other aspects of culture, such as capitalist competition for survival in the market. The pursuit of human well-being, presumably the purpose of technological development, is replaced by an obsessive pursuit of efficiency, even though the exact meaning of efficiency is often unclear. Technological progress is assumed to be always beneficial, while dimensions of sacredness, mystery, and morality are minimized. Autonomous technology reaches fulfillment when people no longer feel uneasy that the "mastery of nature" has come to contradict their own human autonomy.

Winner (1977) claims that autonomous technology is revealed most clearly in technological politics. Examples include the political imperative to promote technology, because the problems created by one technology require another to address them, and the phenomenon of reverse adaptation, in which an end is modified to fit the available means. Showing that technological artifacts have political implications (Winner 1980), Winner argues that modern technology should be perceived as legislation that shapes "the basic pattern and content of human activity in our time" (Winner 1977, p. 323) and as forms of life that have become part of our humanity (Winner 1986). The dilemma of technological society is that decisions about technology are often necessitated by existing technologies (the technological imperative), as in the case of the nuclear power plant and nuclear waste storage. Furthermore, the ends and means of technological enterprises are sometimes reversed (reverse adaptation), as one can see in the development of space projects. In this respect, Winner agrees with Ellul that "if one looks at the broader picture of how technique is welcome and incorporated into society, one can hardly be confident that the origins, activities, and results of social choice about technology are firmly in anyone's grasp at all" (Winner 1995, p. 67).

Nevertheless, while appreciating Ellul's analysis, Winner eventually criticizes him for ignoring human agency in his conception of autonomous technology. For Winner, it is humans who have let modern technology grow out of control by mistakenly ignoring its political dimensions. He argues that although technology is out of control, drifting without fixed direction, it is not fully self-determining, with a life of its own; technology is only semiautonomous. Thus, the issue raised by autonomous technology is "what humanity as a whole will make of them" (Winner 1995, p. 71).



Criticism and Response

Concepts of autonomous technology have been subject to various criticisms and misunderstandings. First, autonomous technology is often accused of reflecting an irrational technophobia. This view relies on the simple assumption that technology is a neutral instrument and, as such, is under full human control. Accordingly, autonomous technology is regarded as a self-contradictory term.

A second objection is that the history of technology shows that technological development is not autonomous. Social constructivists argue that technological developments are contingent, because they are shaped by various sociopolitical and economic influences. A famous example is how the bicycle came to have its current design (Pinch and Bijker 1987). In the nineteenth century, there was a competing design with a large front wheel. Over time, the current design became the standard model, not because of any internal drive for efficiency but simply because people began to perceive the bicycle as a means of transportation rather than as something used for sport. Based on this thesis, some social constructivists have developed theories of public participation in technological decision-making processes (Feenberg 1999; Bijker 1995).

A more serious challenge to autonomous technology is that the idea leads to technological determinism and pessimism. Technological determinism claims that technological development has a unilateral influence on all aspects of human life and follows a fixed path according to its inner dynamics. Consequently, there cannot be any meaningful effort to avert the situation. The concept of autonomous technology is often considered the most straightforward and pessimistic version of technological determinism that denies any hope for a better future in the technological society.

However, the idea of autonomous technology rests on an understanding of technology that is often overlooked by such criticisms. First, autonomous technology refers specifically to modern technology as opposed to traditional technology. Calling a hammer and a nuclear power plant "technology" in the same sense ignores technology as a modern experience. Second, the prime concern of autonomous technology is not individual technologies, such as the bicycle. For Ellul, technology (la technique) is the ensemble of individual technologies that compose a technological system. The particular development of the bicycle is thus irrelevant. Autonomous technology is not about the next step of individual technological development, but about the movement of the technological system at large, with its unintended socioeconomic, cultural, environmental, and political consequences. It is impossible for anyone to claim full control over technological change in this broad sense, which is always geared toward increased levels of technology or artifice in the human world.

When technology is viewed in this way, it is misleading to simply identify autonomous technology with technological determinism. Autonomous technology does not claim that the evolution of individual technologies follows a fixed path, nor does it exclude possible sociopolitical interventions. On the contrary, Winner claims, "one can say that all technologies are socially produced and that technical devices reflect a broad range of social needs" (Winner 1995, p. 70). As noted above, the concept of autonomous technology should be seen in the broader context of technological society. Technological evolution functions like biological evolution, on its own terms but not in a wholly deterministic manner. Autonomous technology certainly allows superficial variations in technical processes, caused by sociocultural and economic factors, but the efficiency principle remains the driving force directing the all-embracing technological enterprise, which human beings are unable to alter or stop. Carl Mitcham (1994) characterizes Ellul's theory as a form of qualified determinism, in contrast with naive determinism.


Autonomous Technology and Human Freedom

Hence, the way in which autonomous technology undermines human autonomy is subtle and indirect. People can freely choose whether they will use this or that computer program, for example, but the decision is made based upon the belief in the inevitability of progress in computer technology, which no one can alter. The conviction that technological progress is inevitable and beneficial is the basis of virtually every political agenda and education system around the globe.

Is an escape possible? Does autonomous technology encourage pessimism by denying human freedom? It is undeniable that this concept is discouraging in the sense that it does not leave much room for a bright future or positive action toward change. Nevertheless, it is important to remember that this concept is proposed in the context of a social critique of the contemporary technological society, rather than being part of theoretical and neutral reflection on technology. Therefore, it is misleading to focus on whether technology is autonomous or not "by nature." The argument for autonomous technology remains strong, as long as people allow technology to increasingly dominate all aspects of their lives without any critical reflection.

Ellul (1988, 1989) sees little hope for reversing the movement of autonomous technology. He argues that the only chance, and the only freedom, left for a human being in the face of autonomous technology is to acknowledge one's non-freedom and to practice an ethics of non-power, namely, deciding not to do everything one can do with technology. Because Winner (1977) views technology as a political phenomenon, he denies the absoluteness of autonomous technology; he proposes new technological forms that can accommodate more public participation and flexibility, thus allowing the possibility of political intervention in the process of technological development. This suggestion was further developed in Richard E. Sclove's "design criteria for democratic technologies" (Sclove 1995). Winner (1977) says that autonomous technology is the question of human autonomy reiterated. This remark succinctly expresses the main concern of the concept because, paradoxically enough, different theories of autonomous technology all emphasize the importance of human autonomy, whether they are encouraging or discouraging about the future of technological society.


WHA-CHUL SON

SEE ALSO Artificial Intelligence; Automation; Autonomy; Critical Social Theory; Determinism; Ellul, Jacques; Frankenstein; Freedom.

BIBLIOGRAPHY

Bijker, Wiebe E. (1995). Of Bicycles, Bakelites, and Bulbs: Toward a Theory of Sociotechnical Change. Cambridge, MA: MIT Press.

Ellul, Jacques. (1954). La Technique ou l'Enjeu du siècle [The technological society]. Paris: Armand Colin. This work is the main reference for the notion of autonomous technology.

Ellul, Jacques. (1988). Le bluff technologique [The technological bluff]. Paris: Hachette. Ellul argues that discourses on technique since the 1980s make people feel comfortable with dehumanizing technique, completing the autonomy of technique. He also talks about the paradox of non-freedom as the only freedom left for people in technological society.

Ellul, Jacques. (1989). "The Search for Ethics in a Technicist Society." Research in Philosophy and Technology 9: 23–36.

Feenberg, Andrew. (1999). Questioning Technology. London: Routledge.

Mitcham, Carl. (1994). Thinking through Technology: The Path between Engineering and Philosophy. Chicago: University of Chicago Press.

Pinch, Trevor J., and Wiebe E. Bijker. (1987). "The Social Construction of Facts and Artifacts: Or How the Sociology of Science and the Sociology of Technology Might Benefit Each Other." In The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology, ed. Wiebe E. Bijker, Thomas P. Hughes, and Trevor J. Pinch. Cambridge, MA: MIT Press.

Sclove, Richard E. (1995). Democracy and Technology. New York: Guilford Press.

Shelley, Mary. (1994 [1818]). Frankenstein, or the Modern Prometheus, the 1818 Text, ed. Marilyn Butler. Oxford: Oxford University Press.

Winner, Langdon. (1977). Autonomous Technology: Technics-Out-of-Control as a Theme in Political Thought. Cambridge, MA: MIT Press. Winner confronts different versions of autonomous technology theory while developing his own.

Winner, Langdon. (1980). "Do Artifacts Have Politics?" Daedalus 109(1): 121–136. This is one of the most frequently cited works in the philosophy of technology concerning the political aspects of technology.

Winner, Langdon. (1986). The Whale and the Reactor: A Search for Limits in an Age of High Technology. Chicago: University of Chicago Press.

Winner, Langdon. (1995). "The Enduring Dilemmas of Autonomous Technique." Bulletin of Science, Technology & Society 15(2): 67–72. The notion of autonomous technology is defended mainly against social constructivism.
