Calculator
A calculator is a computing machine whose purpose is to do mathematical computations under direct human guidance. The simplest calculators perform the basic mathematical functions (addition, subtraction, division, and multiplication), while the more advanced ones, which are relatively new in the history of computing machines, do more complex calculations such as statistics, trigonometric functions, integration, differentiation, graphing, solving polynomials, and more. The term “computer” is, today, generally reserved for a more complex, multipurpose device than the calculator; until the 1940s, a “computer” was a human being who performed calculations, possibly with the assistance of a mechanical or electromechanical calculator or adding machine.
The first calculators
Perhaps the earliest calculating device was the Babylonian rod numeral system. Not only a notational scheme, the rods were a practical tool: administrators carried rods of bamboo, ivory, or iron in bags to help with their calculations. Rod numerals used nine digits (Figure 1).
The next invention in counting machines was the abacus. A counting board dating from approximately 500 BC now resides in the National Museum in Athens. It is not an abacus but the precursor of the abacus. East Asian cultures have documents discussing the abacus (the Chinese call it a suan phan and the Japanese the soroban) as early as the 1500s; however, they were using the abacus at least a thousand years earlier. While experts disagree on the origin of the name (from either the Semitic abq, or dust, or the Greek abax, or sand tray), they do agree that the word is based on the idea of a sand tray used for counting.
A simple abacus has rows (or wires), and each row has 10 beads. Each row represents a value 10 times greater than the previous: the first stands for units of one; the second, units of 10; the third, units of 100; and so on. The appropriate number of beads is moved from left to right on the wire representing that unit. When all 10 beads on a row have been moved, they are returned to their original place and one bead on the next row is moved. The soroban divides the wires into two unequal parts. The beads along the lower, or larger, part represent units, tens, hundreds, and so on. The beads at the top represent five, 50, 500, and so on. These beads are stored away from the central divider and, as needed, are moved toward it.
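The place-value scheme described above can be made concrete with a short sketch. The following illustrative code (not part of the original text, and not modeled on any particular instrument) splits each decimal digit of a number into the one five-bead and up-to-four one-beads that a soroban rod would show:

```python
def soroban_digit(d):
    """Split one decimal digit into soroban beads: (five-beads moved, one-beads moved)."""
    if not 0 <= d <= 9:
        raise ValueError("a rod encodes a single decimal digit")
    return divmod(d, 5)

def soroban_number(n):
    """Encode a non-negative integer as one (fives, ones) pair per rod, most significant rod first."""
    return [soroban_digit(int(c)) for c in str(n)]

# 7 is shown as one five-bead plus two one-beads on a single rod
print(soroban_digit(7))      # (1, 2)
# 1,024 uses four rods
print(soroban_number(1024))  # [(0, 1), (0, 0), (0, 2), (0, 4)]
```

Reading the pairs left to right reproduces the number rod by rod, exactly as the beads moved toward the divider would.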
Finger reckoning must not be ignored as a basic calculator. Nicolaus Rhabda of Smyrna and the Venerable Bede (both of the eighth century AD) wrote in detail how this system, using both hands, could represent numbers up to one million. The numbers from one to 99 are created by the left hand and the numbers from 100 to 9,900 by the right hand. By bending the fingers at their various joints and using the index finger and thumb to represent the multiples of 10, combinations of numbers can be represented. Similar systems were devised much earlier than the eighth century, probably by merchants and traders who could not speak each other’s languages but needed a system to communicate: their fingers. Multiplication using the fingers came much later. Well into the fifteenth century AD, such complex operations as multiplication were left to university students, who were forced to learn a different finger reckoning system to accommodate it.
Early calculators
Wilhelm Schickard, a German professor and Protestant minister, seems to have been the first to create an adding machine, in the 1620s. It performed addition, subtraction, and carrying through the use of gears and preset multiplication tables. The machine only computed numbers up to six digits. Once the operator surpassed this limit, he was required to put a brass ring around his finger to remind him how many carries he had done. Schickard made sure to include a bell so that the user would not forget to add his rings. His drawings of the machine were lost until the mid-1960s, when a scrap of paper bearing a drawing of the machine was located inside a friend’s book. With this drawing and Schickard’s letters, a reconstruction was made in 1971 to honor the adding machine’s 350th anniversary.
Finished in 1642, Blaise Pascal’s calculating machine also automatically carried tens and was limited to six-digit numbers. The digits from zero to nine were represented on dials; when a one was added to a nine, the gear turned to show a zero and the next gear, representing the next higher place, automatically turned. Over 50 of these machines were made. A few remain in existence.
Gottfried Wilhelm Leibniz, a German, created a machine in 1671 which did addition and multiplication. For over 200 years this machine was lost; then it was discovered by some workmen in the attic of one of the Göttingen University’s buildings.
Charles Xavier Thomas de Colmar, in 1820, devised a machine which added subtraction and division to a Leibniz-type calculator. It was the first mass-produced calculator and became a common sight in business offices.
Difference engine
The great English mathematician Charles Babbage (1791–1871) tried to build a machine which would calculate mathematical tables to 26 significant figures; he called it the difference engine. However, in the early 1820s his plans were stalled when the British government pulled its funding. His second attempt, in the 1830s, failed because some of the tools he needed had yet to be invented. Despite this, Babbage did complete part of this second machine, called the analytical engine, in the early 1840s. It is considered to be the first modern calculating machine. The difference between the difference and analytical engines is that the first only performed a certain number of functions, which were built into the machine, while the second could be programmed to solve almost any algebraic equation.
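What made the difference engine practical is that a polynomial of degree n has a constant nth difference, so an entire table can be produced by repeated addition alone, with no multiplication. The following is a hedged sketch of this method of differences; the polynomials and starting points are arbitrary illustrations, not Babbage’s actual tables:

```python
def tabulate(f, degree, start, count):
    """Tabulate f at consecutive integers using only addition, as a
    difference engine does. Needs degree+1 seed values of f to form
    the initial column of finite differences."""
    # Seed values f(start), f(start+1), ..., f(start+degree)
    diffs = [f(start + i) for i in range(degree + 1)]
    # Turn the seeds into f(start) and its 1st..nth forward differences
    for level in range(1, degree + 1):
        for i in range(degree, level - 1, -1):
            diffs[i] -= diffs[i - 1]
    # Now every new table entry is produced purely by addition
    table = []
    for _ in range(count):
        table.append(diffs[0])
        for k in range(degree):
            diffs[k] += diffs[k + 1]
    return table

# Squares, tabulated without a single multiplication in the loop
print(tabulate(lambda x: x * x, 2, 0, 5))  # [0, 1, 4, 9, 16]
```

The Scheutz machines described below worked the same way: set up the initial differences for a polynomial segment, then crank out table values by addition.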
By 1840 the first difference engine was finally built, by the Swedish father-and-son team of Georg and Edvard Scheutz. They based their machine on Babbage’s 1834 publication about his experiments. The Scheutzes’ three machines produced the first automatically created calculation tables. In one 80-hour experiment, the Scheutzes’ machine produced the logarithms of the numbers from one to 10,000; this included the time to reset the machine for the 20 polynomials needed to do the calculations.
Patents
The first patent for a calculating machine was granted to the American Frank Stephen Baldwin in 1875. Baldwin’s machine did all four basic mathematical functions and did not need to be reset after each computation. The second patent was given in 1878 to Willgodt Theophile Odhner from Sweden for a machine of similar design to Baldwin’s. The modern electronic calculators are based on Baldwin’s design.
In 1910, Babbage’s son, Henry P. Babbage, built a printing calculator based on his father’s analytical engine. With this machine, he was able to calculate and then print multiples of π to 29 decimal places.
In 1936, a German student, Konrad Zuse, built the first automatic calculating machine. Without any knowledge of previous calculating machines, Zuse built the Z1 in his parents’ living room. He theorized that the machine had to be able to do the mathematical fundamentals. To do this, he turned to binary mathematics, something no other builder of calculating machines had yet applied. (The mathematics we use in everyday life, decimal, is based on 10 digits. Binary mathematics uses two digits: 0 and 1.) By using binary mathematics, the calculating machine became a series of switches rather than gears, because a switch has two options: on (closed, or 1) or off (open, or 0). He then connected these switches into logic gates, combinations of which can be selected to do addition or subtraction. Zuse’s third model, the Z3 (finished in 1941), was programmable; it added, subtracted, multiplied, divided, extracted square roots, and converted from decimal to binary and back again. However, an addition took a third of a second, and a multiplication an additional three to five seconds. By changing his relays into vacuum tubes he believed he could speed his machine up 1,000 times, but he could not get funding from the Third Reich government to rebuild his machine.
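Zuse’s reduction of arithmetic to switches wired into gates can be illustrated with a simulated ripple-carry adder. This is a generic sketch of binary addition built from logic gates, not a reconstruction of the Z1’s actual mechanism:

```python
# Each "gate" takes bits (0 or 1) and returns a bit, like a relay switch
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """Add three bits using only gates; returns (sum bit, carry out)."""
    s1 = XOR(a, b)
    return XOR(s1, carry_in), OR(AND(a, b), AND(s1, carry_in))

def add(x, y, width=8):
    """Ripple-carry addition of two non-negative integers, one bit at a time,
    exactly as a chain of full adders would do it in hardware."""
    result, carry = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

print(add(13, 9))  # 22
```

Nothing here is more than switching: the same three gate types, repeated per bit position, carry out all of binary addition.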
Electronic predecessor to computer
In 1938, the Bell Telephone Laboratories team of George Stibitz and S. B. Williams began building the Complex Number Calculator. It could add, subtract, multiply, and divide complex numbers. They completed the project in 1940, and until 1949 these calculators were used by Bell Laboratories. It was the first machine to use remote stations (terminals, or input units, not next to the computer) and to allow more than one terminal to be used. The operator typed the request on a teletype machine, and the response was sent back to that teletype. The relays inside the machine were basic telephone relays.
The IBM Automatic Sequence Controlled Calculator (ASCC) was based upon Babbage’s ideas. This machine, completed in 1944, was built for the United States Navy by Harvard University and IBM. It weighed about 5 tons (4,500 kg) and was 51 ft (15.5 m) long and 8 ft (2.4 m) high. A second, more useful version was completed in 1948.
The Electronic Numerical Integrator and Computer (ENIAC), also based on Babbage’s concepts, was completed in 1945 at the University of Pennsylvania for the United States Army. It weighed over 30 tons (27,240 kg) and filled a 30 by 55 ft (9 by 17 m) room. In one second it could do 5,000 additions, 357 multiplications, or 38 divisions. However, reprogramming ENIAC meant rewiring it, which could cause delays of up to two days.
The first electronic calculator was suggested by the Hungarian-born American John von Neumann. Von Neumann introduced the idea of a stored memory for a computer, which allowed both the program and the data to be held in the machine. This resolved the problem of rewiring computers and permitted the computer to move directly from one calculation to another. This type of machine architecture is called “von Neumann.” The first completed computer to use the von Neumann architecture was the English Electronic Delay Storage Automatic Calculator (EDSAC), finished in 1949. An added feature of the EDSAC was that it could be programmed in a type of shorthand, which it then converted into binary code; its precursors demanded that the programmer write the program directly in binary code, a laborious process.
Inside calculators
The early counting machines, items like the car’s odometer, work with a set of gears and wheels. Each wheel is divided into 10 equal parts, on each of which one of the 10 digits appears. (Windows are placed on top of these wheels so that only one digit appears at a time.) The wheels are attached to gears which, as they turn, rotate the wheel so that the digit being displayed changes. When the rightmost wheel changes from 9 to 0, a mechanism is set in motion which turns the wheel to its left one unit, so that the digit it displays changes. This is the carry sequence. When this second wheel changes from 9 to 0, it too has a mechanism to carry to the next wheel to the left.
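The carry sequence described above can be modeled wheel by wheel. In this illustrative sketch, a list of digits stands in for the odometer’s wheels (most significant wheel first):

```python
def advance(wheels):
    """Advance an odometer by one unit. `wheels` lists digits, leftmost wheel first."""
    digits = wheels[:]
    for i in range(len(digits) - 1, -1, -1):
        if digits[i] < 9:
            digits[i] += 1
            return digits          # no carry needed; we are done
        digits[i] = 0              # 9 rolls over to 0 and the carry moves left
    return digits                  # every wheel was 9: the odometer wraps to zero

print(advance([1, 9, 9]))  # [2, 0, 0]
print(advance([9, 9, 9]))  # [0, 0, 0]
```

The loop mirrors the mechanism exactly: most increments touch only the rightmost wheel, and a cascade of 9-to-0 rollovers propagates the carry as far left as needed.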
Electronic calculators have the four major units von Neumann described: input, processing, memory, and output. The input unit accepts the numbers keyed in by the operator (or sent through the reader, in the case of punch cards). The processing unit performs the calculations. When the processing unit encounters a complex calculation, it uses the memory unit to store intermediary results or to locate arithmetic instructions. At the completion of the calculation, the final answer is sent to the output unit, which informs the operator of the result; this may be through a display, on paper, or a combination of the two. Since the calculator works in binary, the output unit must convert the result into decimal.
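The final conversion step the output unit performs, turning the machine’s internal binary result into decimal for display (and back again for input), can be sketched as follows; this is a schematic illustration, not any particular calculator’s circuitry:

```python
def binary_to_decimal(bits):
    """Interpret a string of 0s and 1s as a binary number, most significant bit first."""
    value = 0
    for bit in bits:
        value = value * 2 + int(bit)   # shift left one place, then add the next bit
    return value

def decimal_to_binary(n):
    """Produce the binary digit string for a non-negative integer by repeated division by 2."""
    if n == 0:
        return "0"
    bits = ""
    while n:
        bits = str(n % 2) + bits       # remainder is the next bit, least significant first
        n //= 2
    return bits

print(binary_to_decimal("11011101"))  # 221
print(decimal_to_binary(221))         # '11011101'
```

The two directions are inverses: repeated doubling turns binary into decimal, and repeated halving turns decimal back into binary.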
Modern advances
At the end of 1947, the transistor was invented, eventually making the vacuum tube obsolete. This tiny creation, composed of semiconductors, was much faster and consumed far less energy than tubes. However, as more complex machines needed more complex circuitry, which in turn required more components soldered to more circuit boards, problems arose with the size, speed, and reliability of the connections between components. The next breakthrough came with the invention of the integrated circuit (IC) in 1959 by Texas Instruments (TI) and Fairchild (a semiconductor manufacturing company). The integrated circuit is akin to a solid mass of transistors, resistors, and capacitors. Again, the speed of computation increased (since the resistance in the circuit was reduced), and the energy required by the machine decreased. Finally, a computer could fit on a spaceship (ICs were part of the Apollo guidance computer) or a missile. In 1959, an IC cost over $1,000, but by 1965 they were under $10.
KEY TERMS
Binary— The base 2 system of counting, using two digits: 0 and 1. Each place represents a power of 2: the first place is 2⁰ or 1; the second, 2¹ or 2; the third, 2² or 4; the fourth, 2³ or 8; and so forth. Thus 221 in binary is 11011101: one 128 (2⁷), one 64 (2⁶), no 32s (2⁵), one 16 (2⁴), one 8 (2³), one 4 (2²), no 2s (2¹), and one 1 (2⁰). This system of counting was described by Gottfried Leibniz in the 1600s.
Decimal— The base 10 system of counting, using ten digits: 0, 1, 2, 3, 4, 5, 6, 7, 8, 9. Each place represents a power of 10: the first place is 10⁰ or 1; the second, 10¹ or 10; the third, 10² or 100; and so forth. Thus, 221 is 2 hundreds (10²), 2 tens (10¹), and 1 one (10⁰).
Punch cards— Made of heavy cardboard, these rectangular cards have holes punched in them. Each hole is placed in a designated area, which the computer then translates into binary code. A series of such punched cards contains a sequence of events, or a program. The first punch cards were invented by Jacquard, a weaver, who wanted to automate the creation of patterns in his fabric. Thus, his loom was the first machine to use these cards.
Ted Hoff, an electrical engineer at Intel, conceived of a radical new device: the microprocessor, which incorporated the essential circuitry of a computer onto a single chip, a solid, postage-stamp-sized piece of silicon. The first model of this microprocessor was finished in 1970. Its compactness and speed soon changed the face of the computing industry, which today is entirely dependent on microprocessors.
The creation of integrated circuits allowed calculators to become much faster and smaller. By the 1960s, they had become hand-held; by the mid-1970s, they had become affordable even for high-school students. By the late 1980s, calculators were found on watches (although this design has never been popular, because the controls must be so small). Today, some calculators are so complex as to blur the distinction between computer and calculator.
See also Computer, analog; Computer, digital.
Resources
BOOKS
Chandler, Alfred D., Jr. Inventing the Electronic Century: The Epic Story of the Consumer Electronics and Computer Industries. Cambridge, MA: Harvard University Press, 2005.
Macaulay, David and Neil Ardley. The Way Things Work. New York: Dorling Kindersley, 2004.
OTHER
Vintage Calculators Web Museum. “The History of Pocket Electronic Calculators.” <http://www.vintagecalculators.com/html/history_of_electronic_calculat.html> (accessed October 21, 2006).
Mara W. Cohen