Calculation and Computation
Words containing the roots calcul- and comput- have existed since antiquity. The study of the concepts used to indicate related actions, professions, and (mental and material) artifacts suggests that calculation and computation have not been, as canonically assumed, an exclusive concern of modern times. The persistence of both word clusters throughout the decades (and centuries) before World War II also suggests that it is problematic to assume that the relationship between calculation and computation has been simple—that is, that computation was an exclusively postwar phenomenon brought about by a technical revolution that left calculation behind. For both Charles Steinmetz (1865–1923) and Vannevar Bush (1890–1974), celebrated pioneers of the prewar and interwar generations of electrical engineering, respectively, calculation and computation were of constitutive importance to all their technical work. Their writings indicate a belief that the computing revolution started long before the 1940s. Steinmetz and Bush employed both concepts side by side in their influential engineering textbooks in order to differentiate between high-skill analysis and low-skill application, between creative mental design and routine manual implementation, between the work least and the work most amenable to mechanization.
Steinmetz and Bush perceived themselves as analysts, in the tradition of Gottfried Wilhelm Leibniz (1646–1716) and Isaac Newton (1642–1727), the early modern founders of the calculus. Within the dynamic expansion of the division of labor that has been part and parcel of the expanding capitalist mode of production, the progress of calculation was a prerequisite for the advance of computation. Successful calculation from the top by the well-paid few would, first, routinize the work performed by the multitudes at the base of the pyramid; second, minimize the skill that work required; third, subject it to mechanization; and fourth, lower the workers' salary expectations. Computation was the job of a low-paid clerk known as the "computer" or "computor." These human computers were usually women, who produced computations for the state and the military, for insurance companies and other businesses, and for analysts within the engineering and scientific community. While human computers worked with a rich variety of artifacts, it was the mass employment of mechanical desktop "calculating machines" that determined their history. By contrast, the engineering graduate, almost exclusively male, tried to distance himself from the ranks of the human computers by passionately defending the accuracy of his inexpensive slide rule, which he could own individually and use skillfully to "calculate."
After the 1950s, amid popular expectation that full mechanization had finally arrived, the concept of "computer" came to connote a machine rather than a human, signifying an ideological hegemony that pointed to the full separation of production from skilled labor, of accumulated (or dead) labor from living labor, and of fixed from variable capital. Instead of disappearing, the older engineering differentiation between "analysts" and "computors" resurfaced in the struggle between systems analysts and coders that marked the emergence of programmers, computation's new professionals. The difference between computation and calculation resonates through the fierce competition between the digital and the analog computer (1940s–1950s), followed by the juxtaposition of digital computer hardware and software (1960s–1970s), which was in turn succeeded by the contrast between operating systems and special-purpose (application) software in the 1980s and 1990s.
Premodern, Early Modern, Non-Western
The difference in meaning between "calculation" and "computation" found in Steinmetz and Bush seems to have been built on the precapitalist use of terms formed from the roots calcul- and comput-, respectively. Historiographically, the late medieval period offers an example when one compares the interest in quantifying the heterogeneity of motion shown by the Merton College theorists known as the "Oxford calculatores" with the ecclesiastical community's practice of homogenizing the standardization of time through a technique known as computus. During the late medieval and early modern period, the algorists—promoters of computing who relied on the prior, private memorization of tables of numerical relationships—attacked the abacists, defenders of the ancient tradition that emphasized live calculations performed in public by moving pebbles (that is, calculi) along designated lines. This conflict invites comparison with that between the digital (programming) and the analog (living labor) of recent decades.
Early modern calculating machines physically embodied complex numerical relationships in interconnected mechanical parts concealed by a case; with the motion of the gears masked, only the input and output numbers were displayed. Celebrated examples were Blaise Pascal's (1623–1662) adding machine (1642) and Leibniz's adding and multiplying machine (1685). Earlier in the same century (1623), the German mathematician and linguist Wilhelm Schickard (1592–1635) had mechanized the set of numbered sliding rods that John Napier (1550–1617) had devised in 1617 to simplify astronomical calculations. Both Pascal and Leibniz sought to profit by selling their machines to merchants and natural philosophers, and Galileo Galilei (1564–1642) attempted the same with his improved computing dividers. Many early modern natural philosophers were heavily involved in calculating innovations. Additional contributions include calculation by the analysis of coordinates, owed to René Descartes (1596–1650); calculation by differentiation and integration, in the calculus tradition prefigured in Simon Stevin (1548–1620) and materialized in Newton and Leibniz; and the analysis of multiplication and division into addition and subtraction through the logarithms that Napier introduced in 1614.
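Napier's reduction can be stated compactly: because log(ab) = log a + log b, a printed table of logarithms converts every multiplication into two table lookups and one addition. A minimal sketch in Python, with math.log10 standing in for the printed table of the period:

```python
import math

# Napier's principle: log(a * b) = log(a) + log(b), so a table of
# logarithms turns every multiplication into an addition plus lookups.
a, b = 37.4, 58.1

# "Look up" the logarithms; math.log10 stands in for the printed table.
log_a, log_b = math.log10(a), math.log10(b)

# Add the logarithms, then invert with the antilogarithm (10 ** x).
product = 10 ** (log_a + log_b)

print(round(product, 2))  # 2172.94
print(round(a * b, 2))    # 2172.94 -- the same result by direct multiplication
```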
As with the method of the algorists, speed in calculating by logarithms assumed the availability of the relevant tables. The transformation of these tables into scales, inscribed first on circles sharing a fixed center and, soon after, on scales that slid beside each other within a fixed framework, found its ultimate expression in the logarithmic slide rule, configured by William Oughtred (1574–1660) as early as 1621. The interactive proliferation of tables of logarithms and logarithmic slide rules determined the history of calculation from the early modern period until very recent decades. The dilemmas of recent computation were prefigured in the construction and worldwide use of tens (if not hundreds) of millions of slide rules: linear, circular, cylindrical, or hybrid; wooden, metal, or plastic; handmade or mass produced; cheap or expensive; with accessories such as cursors and magnifying glasses to increase accuracy without increasing size. The recent debates over special- versus general-purpose software and over competing operating systems were long rehearsed in debates over the choice of general- or special-purpose slide rule scales and scale-system standards. Moreover, as scales of all sorts slid beside or within each other, the logarithmic slide rule turned out to be only one of innumerable versions of the slide rule. Scholars of material culture see no end to the collection of calculating wheels devised to compute phenomena ranging from the menstrual cycle to the baseball season.
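The slide rule performed the same reduction physically: to multiply, one added two lengths, each proportional to a logarithm, and read the product off a scale with a precision limited by the eye. The sketch below simulates such a reading; the 25-centimeter scale and the three-significant-figure precision are assumptions about a typical rule, not historical specifications.

```python
import math

RULE_LENGTH_CM = 25.0  # assumed length of a typical "10-inch" rule

def to_position(x):
    """Physical distance along the D scale for a value x in [1, 10)."""
    return RULE_LENGTH_CM * math.log10(x)

def read_off(position):
    """Invert a physical position back to a number, rounded as an eye would."""
    value = 10 ** (position / RULE_LENGTH_CM)
    return float(f"{value:.3g}")  # about three significant figures

# Multiply 2.34 * 3.21: set the C scale's index over 2.34 on D, then read
# D under 3.21 on C -- physically, the sum of the two log-lengths.
position = to_position(2.34) + to_position(3.21)
print(read_off(position))  # 7.51 (the exact product is 7.5114)
```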
The coevolution of logarithmic slide rules and tables of logarithms is no different from the codevelopment of Leibniz's "calculus" and his "calculating machine." Taken together, they point to the paired constitution of the scientific and the technical since early modernity. If Leibniz sought to sell his machine to merchants, Charles Babbage (1792–1871), a Cambridge mathematics professor, was interested in a calculating engine organized internally according to the symbolic efficiency and rationality of the industrial order emerging in Great Britain during the first half of the nineteenth century. Although partially funded by the government, Babbage failed for economic reasons to have his "difference engine" constructed, because he was forced to depend on one of the most skillful workers of the period, Joseph Clement (1779–1844), in whose workshop the engine that was to calculate as if powered by "steam" was to be built. While the construction of the difference engine ran up against the problem of skilled labor, Babbage realized that the use of the engine itself would be limited to special purposes, which would not eliminate calculation's dependence on skilled labor. In 1833 he began drafting sketches of a second engine, the "analytical engine," which he intended to be independent of labor skill. He kept modifying the plans until his death in 1871 without ever advancing beyond programmatic descriptions of such an engine.
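The difference engine's special purpose was the tabulation of polynomials by the method of finite differences, in which every value after the first few follows by addition alone. A minimal sketch, using the polynomial x² + x + 41 that Babbage is said to have favored in demonstrations (the choice of function here is illustrative):

```python
# The method of finite differences: once the leading differences of a
# polynomial are set, every further table entry follows by addition
# alone -- no multiplication is ever needed.

def difference_table(initial, steps):
    """Extend a polynomial table by repeated addition of differences."""
    # For f(x) = x**2 + x + 41: f(0) = 41, first difference 2,
    # second difference 2 (constant for a quadratic).
    f, d1, d2 = initial
    table = [f]
    for _ in range(steps):
        f += d1   # each new value is the old value plus the running
        d1 += d2  # first difference, which itself advances by d2
        table.append(f)
    return table

print(difference_table((41, 2, 2), 6))
# [41, 43, 47, 53, 61, 71, 83] == x**2 + x + 41 for x = 0..6
```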
In The Exposition of 1851; or, Views of the Industry, the Science, and the Government of England, published in London in 1851 following his review of the panorama of industrial capitalism staged at the Crystal Palace world's fair, Babbage repeatedly touched on the ideal of an engine that would render laborers mere "attendants," unable to influence the production of calculation. Although Babbage's calculating engines matched his promotion of a version of continental analysis that he considered more appropriate to a calculation able to sustain further industrial growth, he became known for his emphasis on the mechanization rather than the organization of work. The organization of work had been the pursuit of the division-of-labor scheme that the French engineer Gaspard de Prony (1755–1839) devised in 1792 in order to produce a new set of logarithmic and trigonometric tables as a monument to the new French Republic. Six eminent mathematicians who selected the appropriate equations formed the peak of his pyramid; a layer of ten mathematicians below them advanced the analysis to the point where everything was converted to simple arithmetic; and a group of one hundred humans, recruited even from among hairdressers, each performed simple operations on a part of a problem before passing it to the next person.
The expansive reproduction of this scheme in smaller or larger groups during the nineteenth and twentieth centuries, drawing on sources of labor both within and outside Western society, resulted in the formation of an army of "computers." By the time commercial calculating machines such as Thomas de Colmar's (1785–1870) Arithmometer, exhibited at the 1855 Paris Exposition, were introduced, references to "human computers" could be found as near as a British male scientist's circle of female friends and as far as male Indians working on British engineering initiatives in their country. The relative continuity between Leibniz's calculating machine and the Arithmometer—by 1871 the most widely available calculating machine, based on improvements on Leibniz's design—is an index of the relative continuity between merchant and industrial capitalism. Leibniz had promoted his machine to a society marked by the dynamic appearance of merchants by stating that "it is unworthy of excellent men to lose hours like slaves in the labor of calculation, which could safely be relegated to anyone else if the machine were used" (Leibniz, quoted in David Eugene Smith, A Source Book in Mathematics [1929], pp. 180–181). Babbage's "attendants" were the industrial version of Leibniz's slaves. Babbage wanted his calculating engines used for the production of general- and special-purpose tables, including tables for the navigational pursuits of the British empire. Batteries of human computers—along with other implements such as calculating machines and slide rules—produced numerous tables and charts for an assortment of military and civilian purposes (scientific, engineering, and commercial).
The lack of a synthesis of scholarly studies limits scholars' ability to compare the modern European experience of calculation and computation with that of non-European societies. What is known about the Inca knotted strings known as quipu and about the Chinese knotted cords suggests that societies paradigmatic of civilization on other continents relied heavily on what is now identified as calculation and computation. Ongoing historical interpretation of archaeological findings from ancient Mesopotamia points to the few who knew how to connect computations routinely produced by the many to a calculating coefficient that could make the abstract concrete. Projecting Western conceptual demarcations onto non-Western societies may prove problematic, especially considering the parallel dead ends reached by projecting late modern Western demarcations onto ancient and early modern histories. Interpreting the Hellenistic Antikythera mechanism as "analog," and accordingly as a technically inferior computer, has kept historians from taking into account the digital computation introduced by the complicated geared structure underneath the disk that represented the universe analogically. As a result, the search for how the technical accuracy of the artifact matched the social interests of the period has been replaced by the assumption that its accuracy was limited because it belonged to an essentially inaccurate technical genre. Similarly, a historiography aimed at interpreting the analog motion of the pebbles on ancient and early modern abacuses has been blocked by the late modern emphasis on the resting pebbles—that is, on the perception of pebbles as digits. The tradition of the Chinese and the Western abacus has therefore been flatly situated under the "digital," making it impossible to acknowledge differences in the employment of abacus analogies between and within traditions.
Late Modern Period
Abacuses, along with artifacts from the associated tradition of counting boards with pebbles or beads, share the honor of being ancestors of the digital computer with the various mechanical and electromechanical desktop calculating machines produced from 1850 to 1950. They are joined by an ensemble of mechanical and (later) electromechanical tools and machines used for punching holes into cards so as to represent computable variables, for sorting these cards according to the variable to be computed, and for tabulating and printing the results. These are best known as punched card machines, after the part of the process that was least subject to mechanization. Starting at the end of the nineteenth century, punched card machines were rented by companies ancestral to IBM (International Business Machines), first to the United States Census Bureau and subsequently to the census bureaus of nations around the world. Calculating and punched card machines were extensively used for filing, accounting, and related activities that involved processing large amounts of data in larger enterprises, ranging from railroads to insurance companies. The U.S. Social Security Administration also used hundreds of punched card mechanisms and machines to implement a social security system that, in 1935, had to handle information about the wages paid by three million employers to their twenty-six million employees.
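The punch-sort-tabulate workflow that these machines mechanized can be suggested in miniature; the census fields in the following sketch are invented for illustration.

```python
from collections import Counter

# Each "card" records one return, with variables punched as columns.
cards = [
    {"state": "NY", "occupation": "clerk"},
    {"state": "PA", "occupation": "miner"},
    {"state": "NY", "occupation": "teacher"},
    {"state": "PA", "occupation": "miner"},
]

# The sorter groups the cards by the chosen column...
cards.sort(key=lambda card: card["state"])

# ...and the tabulator counts the groups and prints the totals.
totals = Counter(card["state"] for card in cards)
for state, count in totals.items():
    print(state, count)  # NY 2, then PA 2
```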
In addition to the slide rule, the list of what has a posteriori been placed under the heading of analog computers includes calendars, sundials, orreries, astrolabes, planetariums, material models of all kinds (including scale models), mathematical and other mental models, graphs that could be as complex as the nomograms of Maurice d'Ocagne (1862–1938) and his followers (used from the late nineteenth century until recent decades), computing linkages, artifacts with mechanical integrators and differentiators, curve tracers and kinematic mechanisms in the tradition of planimeters and associated artifacts, harmonic analyzers and synthesizers such as the tide predictor built by Lord Kelvin, mechanical, electromechanical, and electrical analyzers for general purposes (e.g., Bush's differential analyzer) and special purposes (e.g., Bush's electric power network analyzers), electrolytic tanks, resistive papers and elastic membranes used as models, and countless mechanisms and machines produced and used in fire control (internal, external, and terminal ballistics). Case studies have retrieved the histories of many other unique tools and machines, including those used for crucial tidal calculations in the Netherlands.
Changes in calculating machinery were coupled with changes that made the calculus correspondingly operational. Though remembered more as the author of the state-of-the-art calculating machines of the interwar period, Bush was also the author of influential writings on the "operational calculus," which capitalized on a tradition of modifying the calculus to adjust it to the ever-changing needs of engineers who thought of themselves as the equal of any scientist. In the 1930s and 1940s, punched card machines were reconfigured to be useful in scientific calculations. Bush's differential and network analyzers were also used in the scientific calculations of the period immediately before and after World War II; calculating machines, like slide rules, had been used in scientific calculations even earlier.
The various branches of the state have had an organic role in fostering technological change. To the examples of the extensive use of digital punched card machinery by civilian apparatuses such as the census should be added the military's purchases of analog fire control mechanisms and machines; about the state's exclusive involvement in cryptography-related calculation and computation even less is known, with the exception of the celebrated British successes of World War II, when a team at Bletchley Park that included the mathematician Alan Turing (1912–1954) broke the codes of the German Enigma machine with the help of electromechanical devices, while the electronic Colossus machines attacked the German teleprinter ciphers. The military had actually used hundreds of punched card machines in World War I for materiel inventory and for medical record keeping. Interwar support for military needs advanced so remarkably that by World War II the computing bombsight and the antiaircraft director had become extremely complex artifacts with thousands of mechanical, electrical, and electronic parts. The competing development of the computing bombsight, which increased the target reach of a bomber, and the antiaircraft director, which increased the reach of ground-based fire aimed at the bomber itself, formed a vicious circle that exemplifies the contradictions of modern technology. In the United States, the development of accurate bombsights during World War II was a secret second only to the construction of the atomic bomb. The starkest technological contradiction to date may be that the most accurate computing bombsight was used to drop the first atomic bomb, a weapon so lethal that its efficient release required the least accuracy.
Contemporary Period
The demand for ballistics tables beyond what available human computers could supply during World War II resulted in the construction of ENIAC (Electronic Numerical Integrator and Computer), considered by many the first digital electronic computer. Employing some 18,000 vacuum tubes, it was constructed between 1943 and 1945 at the Moore School of Electrical Engineering of the University of Pennsylvania by a team led by John Mauchly (1907–1980) and J. Presper Eckert (1919–1995), with funds from the U.S. Army's Ballistics Research Laboratory. Under the direction of Howard Aiken (1900–1973), a physics instructor at Harvard University, IBM completed in 1944 the construction of the Automatic Sequence Controlled Calculator (Harvard Mark I), used to produce the U.S. Navy's ballistics tables. A wealth of detail exists about the 1940s and 1950s, the period of the heroes of electronic computation, which has attracted the disproportionate attention of historians. At the very least, this attention has made clear that the electronic computer was a "multiple invention," that is, one that emerged from broad social causes rather than from individual genius.
The construction of ENIAC was roughly contemporaneous with several other projects. Aiken's efforts at reconfiguring IBM's accounting machines for the purpose of solving nonlinear differential equations went back to 1937. The Bell Labs employee George Stibitz constructed a series of machines based on relays and other telephonic equipment, including the Complex Number Calculator, machines that were used successfully by the military between 1940 and 1949. In Germany, starting in 1938 and continuing through the war, Konrad Zuse (1910–1995) constructed a series of machines likewise based on electromechanical relays. Iowa State College physics professor John Atanasoff (1903–1995) and his graduate student Clifford Berry (1918–1963) built a special-purpose electronic computer, the ABC (Atanasoff-Berry Computer), between 1939 and 1942. Ranging from the big to the gigantic, these machines inaugurated experimentation with correspondences between electronic circuitry and numerical computational relationships, in the decimal or the binary system. Claude Shannon is the most recognizable of those who argued for such a correspondence theoretically, by showing that the design of switching circuits and the reduction of reasoning to a series of simple algebraic operations (as proposed by Boolean algebra) could be used to push each other forward.
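Shannon's correspondence can be suggested by a few lines of Python: binary addition expressed as Boolean operations, the very mapping that allows switching circuits to compute. The half adder and full adder below are textbook constructions, not reconstructions of any particular machine.

```python
def half_adder(a, b):
    """Add two bits: the sum is XOR, the carry is AND -- two gates."""
    return a ^ b, a & b

def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry, as circuits chain them."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

# 0b101 (5) + 0b011 (3), least significant bit first:
bits_a, bits_b = [1, 0, 1], [1, 1, 0]
carry, result = 0, []
for a, b in zip(bits_a, bits_b):
    s, carry = full_adder(a, b, carry)
    result.append(s)
result.append(carry)
print(result)  # [0, 0, 0, 1] -> 0b1000 == 8
```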
Having participated in the Moore School project, the Princeton mathematician John von Neumann (1903–1957) shaped the following generation of electronic computers by setting the standard of a computer architecture based on an internal division of labor: a control unit interacted with the memory to check the flow of data between the memory and the arithmetic unit while also controlling input and output. This division between an arithmetic unit, in which any future calculation was to take place (by drawing on selected past computations from a stock accumulated in the memory), and the memory itself imported the dynamic balance between living and dead labor of the capitalist economy as a whole into the workings of the machine; during the dynamic self-accumulation of past computations in the memory, the balance was to be maintained by the control unit. The accumulation of data and instructions in the memory unit became known as the "stored-program technique." It represented an economy of flexible allocation of resources, the opposite of the brute-force approach of the previous generation of electronic computers. Von Neumann presented his architecture in a 1945 report on EDVAC, a computer to follow ENIAC. The architecture was rehearsed in the construction of the IAS (named after Princeton's Institute for Advanced Study) by a team led by von Neumann at Princeton; it was completed in 1952. Like most experimental computers of the period, the IAS was funded by sources such as the military and the Atomic Energy Commission. Similar machines were constructed at seventeen other research centers in the United States and at several more in various other countries.
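The architecture can be suggested in miniature by a fetch-decode-execute loop over a single memory that holds both instructions and data; the three-instruction set in the sketch below is invented for illustration.

```python
# A minimal stored-program sketch: instructions and data share one
# memory, and the control loop fetches, decodes, and executes.
memory = [
    ("LOAD", 6),     # 0: accumulator <- memory[6]
    ("ADD", 7),      # 1: accumulator += memory[7]
    ("STORE", 8),    # 2: memory[8] <- accumulator
    ("HALT", None),  # 3: stop
    None, None,      # 4-5: unused
    40, 2,           # 6-7: data, side by side with the program
    0,               # 8: the result goes here
]

accumulator = 0  # the arithmetic unit's register
pc = 0           # the control unit's program counter
while True:
    opcode, address = memory[pc]  # fetch from the same memory as the data
    pc += 1
    if opcode == "LOAD":
        accumulator = memory[address]
    elif opcode == "ADD":
        accumulator += memory[address]
    elif opcode == "STORE":
        memory[address] = accumulator
    elif opcode == "HALT":
        break

print(memory[8])  # 42
```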
Interested in commercial rather than scientific computers, Eckert and Mauchly tried a series of intermediate computer configurations and business schemes before the production of the Remington Rand UNIVAC (Universal Automatic Computer), the first of which was delivered to the Census Bureau in 1951. The first UNIVAC for business use was installed at the General Electric Appliance Division for payroll computations in 1954. By 1957 the number of UNIVACs sold had reached forty-six. Along with several other manufacturers, IBM entered the electronic computer business in the early 1950s. It started with the 1951 IAS-type Defense Calculator, later renamed the IBM 701, of which IBM constructed and rented nineteen. By 1960 IBM dominated the market with machines such as the IBM 650, usually rented to universities on attractive terms, which subsequently tied university computing to IBM. IBM's dominance was solidified by its introduction in the mid-1960s of the standard-setting System/360 family of computers.
By then, the analog-digital debate, which had started in the late 1940s and escalated in the early 1950s, was practically over. The evolution of the MIT Project Whirlwind between 1945 and 1953, under the direction of Jay Forrester, captured the emergence of the analog-digital demarcation. Intended for use in real-time aircraft flight simulation, Whirlwind started as an analog machine; upon learning about the EDVAC, Forrester decided to attempt the construction of a digital computer. His costly change of direction was supported, initially, by the U.S. Office of Naval Research with approximately one million dollars per year. When the Navy gave up, the Air Force stepped into the void, hoping that the digital Whirlwind could lead to a machine suitable to the needs of SAGE (Semi-Automatic Ground Environment), a system to coordinate the detection of, and response to, the Soviet Union's strategic bombers. The pursuit of SAGE brought about enormous demands for programming, thereby revealing the dependence on computer software. It started to become apparent that the analog-digital contrast was being succeeded by a contrast between software and hardware.
There is no record of thinkers who, in the 1940s, foresaw a market for more than a few mainframes. Nor is there any record of thinkers who predicted that the future of computation would not lie in the formation of computer utilities, the direction suggested by the time-sharing of mainframes during the 1960s. Patented in 1959 by Jack Kilby of Texas Instruments, the integrated circuit contained all the elements of an electronic circuit in a chip of silicon. The microprocessor appeared a decade later as a general-purpose programmable integrated circuit. The cheapening of hardware and the miniaturization of electronic components made possible the reduction of the mainframe to the size of the minicomputer. The potential for reducing the computer further, to the size of a home appliance, was realized in the subsequent decades, resulting in the mass production and use of personal computers during the 1980s and the mass interconnection of personal computers that led to the formation of the Internet and the World Wide Web during the 1990s. In the meantime, microprocessors have been installed in everything from home appliances to automobiles.
The decrease in the value of hardware accentuated the increase in the value of software. What ended as a "software crisis" began as a problem of "programmer shortage." Generations of general- and special-purpose programming languages, and, by now, operating systems, have yet to provide a stable solution. Attempts at computer-aided programming (machine self-programming) and software engineering (the mass production of software) have met with limited success, if not complete failure. From eliminating the "bugs" that clogged the early electronic computers to blocking the so-called spam that clogs contemporary e-mail, computation seems to have increased rather than decreased the dependence on skilled labor. In its absence, all sorts of computer viruses threaten the contemporary world with catastrophe. Many of the world's inhabitants anxiously anticipated living through the transition from the year 1999 to the year 2000, which became known as Y2K; their anxiety stemmed from decades of the labor-saving yet shortsighted use of two digits to represent the year, adopted to make electronic computations economical.
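The arithmetic underlying the Y2K anxiety is easily illustrated; the example below is a schematic sketch, not a reconstruction of any particular system.

```python
# The Y2K problem in miniature: storing only two digits of the year
# makes "00" indistinguishable from 1900, so age arithmetic breaks.

def age_from_two_digit_years(birth_yy, current_yy):
    """Age computed the storage-saving way, with two-digit years only."""
    return current_yy - birth_yy

# Someone born in 1965, computed first in 1999 and then in 2000:
print(age_from_two_digit_years(65, 99))  # 34 -- correct
print(age_from_two_digit_years(65, 0))   # -65 -- the rollover bug

# The four-digit repair costs two more stored digits per date:
print(2000 - 1965)                       # 35 -- correct again
```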
See also Computer Science; Mathematics; Technology.
Bibliography
Abbate, Janet. Inventing the Internet. Cambridge, Mass.: MIT Press, 1999.
Aspray, William, ed. Computing before Computers. Ames: Iowa State University Press, 1990.
Beniger, James R. The Control Revolution: Technological and Economic Origins of the Information Society. Cambridge, Mass.: Harvard University Press, 1986.
Black, Edwin. IBM and the Holocaust: The Strategic Alliance between Nazi Germany and America's Most Powerful Corporation. New York: Crown, 2001.
Blok, Aad, and Greg Downey, eds. Uncovering Labour in Information Revolutions, 1750–2000. Cambridge, U.K., and New York: Cambridge University Press, 2003.
Borst, Arno. The Ordering of Time: From the Ancient Computus to the Modern Computer. Translated by Andrew Winnard. Chicago: University of Chicago Press, 1994.
Campbell-Kelly, Martin. From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry. Cambridge, Mass.: MIT Press, 2003.
Campbell-Kelly, Martin, et al. The History of Mathematical Tables: From Sumer to Spreadsheets. Oxford and New York: Oxford University Press, 2003.
Campbell-Kelly, Martin, and William Aspray. Computer: A History of the Information Machine. New York: Basic Books, 1996.
Ceruzzi, Paul. A History of Modern Computing. 2nd ed. London and Cambridge, Mass.: MIT Press, 2003.
Cortada, James. Before the Computer: IBM, NCR, Burroughs, and Remington Rand and the Industry They Created, 1865–1956. Princeton, N.J.: Princeton University Press, 1993.
Edwards, Paul. The Closed World: Computers and the Politics of Discourse in Cold War America. Cambridge, Mass.: MIT Press, 1996.
Hopp, Peter M. Slide Rules: Their History, Models and Makers. Mendham, N.J.: Astragal Press, 1999.
Jezierski, Dieter von. Slide Rules: A Journey through Three Centuries. Translated by Rodger Shepherd. Mendham, N.J.: Astragal Press, 2000.
Kline, Ronald. Steinmetz: Engineer and Socialist. Baltimore: Johns Hopkins University Press, 1992.
Lubar, Steven. InfoCulture: The Smithsonian Book of Information Age Inventions. Boston, Mass.: Houghton Mifflin, 1993.
MacKenzie, Donald. Knowing Machines: Essays on Technical Change. Cambridge, Mass.: MIT Press, 1996.
McFarland, Stephen L. America's Pursuit of Precision Bombing, 1910–1945. Washington, D.C.: Smithsonian Institution Press, 1995.
Menninger, Karl. Number Words and Number Symbols: A Cultural History of Numbers. Translated by Paul Broneer. Cambridge, Mass.: MIT Press, 1969.
Mindell, David A. Between Human and Machine: Feedback, Control, and Computing Before Cybernetics. Baltimore: Johns Hopkins University Press, 2002.
Nebeker, Frederik. Calculating the Weather: Meteorology in the 20th Century. San Diego, Calif.: Academic Press, 1995.
Small, James S. The Analogue Alternative: The Electronic Analogue Computer in Britain and the USA, 1930–1975. London and New York: Routledge, 2001.
Spufford, Francis, and Jenny Uglow, eds. Cultural Babbage: Technology, Time and Invention. London: Faber, 1996.
Williams, Michael R. A History of Computing Technology. 2nd ed. Los Alamitos, Calif.: IEEE Computer Society, 1997.
Yates, JoAnne. Control through Communication: The Rise of System in American Management. Baltimore: Johns Hopkins University Press, 1989.
Zachary, G. Pascal. Endless Frontier: Vannevar Bush, Engineer of the American Century. New York: Free Press, 1997.
Aristotle Tympas