The Return of Rigor to Mathematics
Overview
Rigor had been a characteristic of mathematics going back to Greek times. For much of the Renaissance and the Enlightenment, however, mathematics in general, and calculus in particular, was more about solving problems than about proving the correctness of theorems with logical exactness. The nineteenth century saw the return of logical rigor to mathematics, accompanied by a more thorough investigation of the foundations of mathematics than had been attempted previously, thanks largely to progress in mathematical logic.
Background
One of the great works of Greek mathematics to come down to the world of scholarship after the Middle Ages was the translation of the Elements of Euclid of Alexandria (fl. c. 300 b.c.), which arranged the results of geometry in a sequence of definitions, axioms, postulates, and theorems, each theorem proved using those that came before it. Thanks to this arrangement of the results in the Elements, students could have complete confidence in geometrical theorems, although they still had to worry about how to apply them to the world of concrete objects. Other disciplines sought to emulate the form of Euclid's work in an effort to capture the same air of certainty, but the attempt was never carried out with equal completeness.
Mathematicians of the Renaissance, however, found themselves more occupied with solving problems than with proving their results rigorously. Confidence in the methods being used came from their giving correct answers rather than from their being grounded in previously proved theorems. Perhaps the culmination of this approach was the invention of the calculus in the seventeenth century by Sir Isaac Newton (1642-1727) and Gottfried Wilhelm Leibniz (1646-1716). The calculus, devoted to the study of change and the finding of volumes, proved to be an invaluable tool for settling all sorts of questions that the geometry of Euclid could not. What was disturbing was that neither Newton nor Leibniz nor any of their followers could explain why the methods of the calculus worked. They referred to "infinitely small" quantities without explaining what such quantities were. Attempts to lay a foundation for the calculus always rested on an appeal to intuition, whether physical or mathematical, rather than on a chain of logically exact arguments.
In the next century Leonhard Euler (1707-1783) extended many of the results that Newton and Leibniz had obtained, but he was no more successful in providing a convincing explanation for why the impressive edifice of the calculus worked. One of Euler's chief subjects of research was the use of series to represent functions such as the trigonometric and exponential functions. Others who tried to follow Euler's work found themselves writing down meaningless statements, since there were no basic principles to serve as a guide for the study of such series. Some of Euler's work had looked puzzling but turned out to be useful nonetheless; the mathematicians who tried to continue it could not figure out what distinguished sense from nonsense in the foundations of the calculus.
By the nineteenth century there was a widespread conviction that mathematics in general needed more secure foundations. Those responsible for writing textbooks for students of the calculus felt an obligation to make the material convincing as well as comprehensible. It was hard to require students to avoid making nonsensical statements when mathematicians doing research were unable to avoid the same trap. The physics community was not likely to have confidence in the statements coming from mathematics if the statements looked irrelevant to the physical objects being studied.
Impact
One of the first breakthroughs in restoring the level of rigor that had long been lost to mathematics was a paper written by Carl Friedrich Gauss (1777-1855), perhaps the greatest mathematician of the century. Gauss's paper dealt with the hypergeometric series, a particular sort of representation of a function, but what made it so important was his great attention to the convergence of the series he discussed. A series is convergent if its sum settles on a finite value; a divergent series fails to do so. All sorts of problems arose from the appearance of infinity in mathematics, whether the infinitely large or the infinitely small. Gauss made sure that the infinitely large would not be a problem for the sort of series he was investigating.
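To illustrate the distinction Gauss insisted on, in modern notation (these standard examples are illustrative and do not come from Gauss's paper), the geometric series converges while the harmonic series diverges:

\[
\sum_{n=0}^{\infty} \frac{1}{2^{n}} = 1 + \tfrac{1}{2} + \tfrac{1}{4} + \cdots = 2,
\qquad
\sum_{n=1}^{\infty} \frac{1}{n} = 1 + \tfrac{1}{2} + \tfrac{1}{3} + \cdots \to \infty .
\]

Manipulating a series term by term is only safe once such questions of convergence have been settled, which is part of why this care mattered so much.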
The individual who did the most to put mathematics back on a rigorous footing was Augustin-Louis Cauchy (1789-1857), a writer who contributed an immense amount to the teaching of mathematics as well as to mathematical research. Cauchy was the first to take the basic objects of the calculus—the derivative (a way of expressing the rate of change of a function) and the integral (an expression for the area under the graph of a function)—and present them in a way that did not require the sort of appeal to the infinite that had been present since the time of Newton. What Cauchy did was to make the idea of a limit central to the definitions of derivative and integral and to base limits on the way real numbers behaved. Although he did not have an explicit picture of the basis for the real numbers, this was already a great improvement on the kind of slipshod argument prevalent previously.
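In modern notation (today's symbolism rather than Cauchy's own), the limit-based definitions run roughly as follows: the derivative is a limit of difference quotients, and the integral is a limit of finite sums,

\[
f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h},
\qquad
\int_{a}^{b} f(x)\, dx = \lim \sum_{k=1}^{n} f(x_{k-1})\,(x_{k} - x_{k-1}),
\]

where \(a = x_{0} < x_{1} < \dots < x_{n} = b\) divides the interval into pieces whose widths shrink to zero. No "infinitely small" quantity appears anywhere; everything is stated in terms of limits of ordinary numbers.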
On the strength of the notion of limit, Cauchy was able to make precise what it meant for a function to be continuous. From an intuitive point of view, a function is continuous if its graph can be drawn without lifting one's pencil from the paper. That picture, however, is of little help in characterizing exactly which functions are continuous, and the class of continuous functions was of central importance in applying the calculus: many of its fundamental theorems apply only to continuous functions, so Cauchy's definitions proved essential in identifying them. In the same way, Cauchy was able to make the idea of a convergent sequence more explicit than it had been in the work of earlier writers. His efforts at last supplied a justification for the way Newton and Leibniz had used the calculus.
After Cauchy, work on making mathematics even more rigorous proceeded in two directions. One of these was led by Karl Theodor Wilhelm Weierstrass (1815-1897), who recognized that there were important functions that did not behave as well as some of those Cauchy had studied. It had been typical to think of a function as represented by a smooth curve, but physicists as well as mathematicians recognized the benefits of studying curves with jumps or sharp corners. Weierstrass took Cauchy's definitions and made them even more straightforward to apply, removing any appeal to notions beyond arithmetic. As an influential teacher, Weierstrass brought this arithmetization of the calculus to a generation of students and created the approach to the objects of calculus, such as limits, derivatives, and integrals, that is still used today.
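In the arithmetized form that descends from Weierstrass's teaching (stated here in modern notation), even the limit itself is reduced to inequalities between ordinary numbers. For example, a function f is continuous at a point a when

\[
\text{for every } \varepsilon > 0 \text{ there is a } \delta > 0 \text{ such that }
|x - a| < \delta \implies |f(x) - f(a)| < \varepsilon ,
\]

a statement that mentions nothing but numbers and their comparisons, with no appeal to motion, drawing, or the infinitely small.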
The other direction for researchers was to understand the real numbers themselves better. One of the leaders of this movement was Richard Dedekind (1831-1916), author of the book What Are Numbers and What Should They Be? Dedekind was aware that even Cauchy's careful study of the foundations of the calculus depended on the underlying numbers, which had received less than rigorous attention even from the Greeks. The Greek mathematicians had put geometrical objects at the center of mathematics and used lengths to represent numbers. Dedekind developed a sophisticated account of the ordinary real numbers by starting with the fractions and regarding the irrational (nonfractional) numbers as dividing lines between sets of fractions. By this method of cuts, Dedekind was able to prove results about the numbers that had previously rested only on intuition.
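The standard textbook illustration of a Dedekind cut (an illustration, not a reconstruction of Dedekind's own exposition) defines the square root of 2 by splitting the rational numbers into two sets,

\[
A = \{\, r \in \mathbb{Q} : r \le 0 \ \text{or}\ r^{2} < 2 \,\},
\qquad
B = \{\, r \in \mathbb{Q} : r > 0 \ \text{and}\ r^{2} > 2 \,\};
\]

since no rational number has a square equal to 2, every rational falls into exactly one of the two sets, and the cut itself plays the role of the irrational number \(\sqrt{2}\).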
The whole theory of sets was given a new foundation by Georg Cantor (1845-1918), the mathematician who did the most to make the infinite respectable again after all the problems it had created over the years. Cantor presented a theory of infinite numbers with an explanation of when two infinite sets have the same number of members. This also allowed him to speak of two infinite sets differing in size, something that had perplexed previous students of the infinite. Cantor's work was unpopular with those who felt that the infinite had no place in mathematics, but it proved central to extending mathematics beyond the functions that had been known and studied before the infinite was tamed.
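Cantor's criterion was that two sets have the same number of members when their elements can be paired off exactly, with nothing left over on either side. A standard illustration (not drawn from Cantor's own papers) is the pairing

\[
1 \leftrightarrow 2, \quad 2 \leftrightarrow 4, \quad 3 \leftrightarrow 6, \quad \dots, \quad n \leftrightarrow 2n, \quad \dots
\]

which shows that the natural numbers and the even numbers form infinite sets of the same size, even though one is a proper part of the other. Cantor's further arguments showed that no such pairing is possible between the natural numbers and the real numbers, so that there are genuinely different sizes of infinity.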
The work of Dedekind and Cantor formed part of the new discipline that was to be given the name of mathematical logic. It received that name for two reasons: it was the study of the logic used in laying the foundations of mathematics, and it used mathematics to study logic itself. Well into the nineteenth century the study of logic was restricted to the work of Aristotle (384-322 b.c.), or at best to the commentaries it continued to receive. The philosophical sophistication of Aristotelian logic was substantial, but it had not been devised with mathematics particularly in mind. English mathematician George Boole (1815-1864) undertook a mathematical analysis of logic that went beyond the limits of Aristotelian logic and drew on ideas current in the study of algebra. These new techniques made it possible to study the foundations of mathematics, and proofs themselves, in detail. By the end of the century there were several ambitious attempts to provide a rigorous foundation for all of mathematics, although subsequent work was to point out why they were doomed to fall short.
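In the algebraic notation that grew out of Boole's work (modern Boolean algebra rather than Boole's own symbolism), logical laws become equations that can be checked by calculation, for instance

\[
x \land (y \lor z) = (x \land y) \lor (x \land z),
\qquad
\lnot (x \lor y) = \lnot x \land \lnot y ,
\]

where the variables take only the two values true and false. Treating logic in this calculational way is what made it possible to study proofs themselves as mathematical objects.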
The revolution in rigor that took place in mathematics in the nineteenth century was a response to the problems that had been encountered in teaching and applying calculus for more than a hundred years. It was essential to broaden the range of objects to which calculus could be applied, but that was scarcely possible if the calculus itself were so little understood. The movement in the direction of greater rigor enabled mathematics to have more to say about physics in the twentieth century than it ever had before. In addition, the detailed study of mathematical objects and proofs led to the development of mathematical logic, an essential tool on the road to computer science. Even those who had only wanted a return to Greek rigor as practiced by Euclid were the beneficiaries of the techniques that went well beyond the elementary.
THOMAS DRUCKER
Further Reading
Bottazzini, U. The Higher Calculus: A History of Real and Complex Analysis from Euler to Weierstrass. New York: Springer-Verlag, 1986.
Grabiner, Judith V. The Origins of Cauchy's Rigorous Calculus. Cambridge, MA: MIT Press, 1981.
Grattan-Guinness, Ivor. The Development of the Foundations of Mathematical Analysis from Euler to Riemann. Cambridge, MA: MIT Press, 1970.
Hawkins, Thomas W., Jr. Lebesgue's Theory of Integration: Its Origin and Development. Madison, WI: University of Wisconsin Press, 1970.
Kitcher, Philip. The Nature of Mathematical Knowledge. Oxford: Oxford University Press, 1983.
Tiles, Mary. Mathematics and the Image of Reason. London: Routledge, 1991.