SCIENCE AND ENGINEERING INDICATORS
Science and Engineering Indicators is a term referring to efforts to measure the pursuit, support, and performance of science and engineering on scales that extend geographically from the local to the international. The goal of such indicators is usually to help direct policy programs in research, education, and industrial support. The most prominent and celebrated of these is Science and Engineering Indicators (referred to from here on as Indicators), published every two years in the United States by the National Science Board (NSB). The NSB is the body that oversees the budget and policies of the National Science Foundation (NSF), and the report itself is prepared by NSF's Science and Resources Directorate.
As an NSF publication, Indicators was conceived after Congress, in 1968, broadened the NSF Charter to include more engineering and social sciences in the agency's support portfolio. Legislators desired a sense of the impact government support for research was having on the "health" of the national research system, and NSF, which already had an active statistics branch, broadened its ambitions to large-scale endeavors.
The first Indicators report was issued in 1972 as simply "Science Indicators," and ever since it has been the worldwide standard reference and model for the statistical treatment of science, engineering, and technology. Engineering appeared in its title in 1986, when the NSF, under Congressional pressure, sharply raised its budget for engineering research and elevated its interest in supporting partnerships between U.S. universities and industry.
No mandate, however, was established for assessing the social and economic impact of science and engineering. The editors of Indicators have been conscious of, and curious about, returns on government research investment. But they believe the report is already extensive enough, and that performance indicators assessing such outcomes are, and always have been, imposingly difficult to measure. Quantified data will probably always constitute the core of the Indicators endeavor.
As the research system has grown and changed over the years, Indicators has evolved in style, content, and presentation. The 1976 edition, reflecting a relatively simple time in the measurement of science and technology for policy, contained chapters titled "International Indicators of Science and Technology," "Resources for Research and Development," "Resources for Basic Research, Industrial R&D and Innovation," "Science and Engineering Personnel," and "Public Attitudes toward Science and Technology."
By comparison, the more voluminous and finely rendered 2002 edition mirrored the rise of new technologies, the increasing globalization of science and technology, and the wider mingling of corporate, university, and government interests. Its chapters included "Elementary and Secondary Education," "Higher Education Science and Engineering," "Science and Engineering Workforce," "Funding and Alliances in U.S. and International Research and Development," "Academic Research and Development," "Industry, Technology, and the Global Marketplace," "Public Attitudes and Public Understanding of Science and Technology," and a special chapter entitled "Significance of Information Technology." The increasing specificity of the chapter titles made it clear that the Indicators editors were being nudged toward treating the facts and figures of science and engineering as more than self-referential measures of the enterprise.
The 2004 edition extended the publication's reach by introducing a chapter of state-by-state research and development statistics, mainly to reflect the importance states place on science and engineering for their economic development. But as to actual state-by-state outcomes, Indicators once more begged off, declining to enter with any resoluteness an area in which statistics are, in the editors' view, impossible to gather.
The era of the Internet has improved the currency and relevance of Indicators. NSF has taken advantage of Internet technology by continually updating the data in the report's interactive online version. Thus readers can no longer object, as they did in the past, that the publication's data were too out of date to be useful. That objection was a valid one for scholarship: by the date of publication, much of the Indicators data was often more than a year old.
Identifying exactly what indicators of science, engineering, and technology ought to indicate is a subject without consensus but ripe for speculation, especially in the ethical dimensions of the technical universe. The report's chapters draw conclusions and projections, but the publication largely leaves it to readers to interpret what the numbers mean. One certainty is that Indicators confirms that science and technology have grown hugely in both complexity and scope since the report was first issued, raising issues of how scientific and technological change affects, and indeed can improve, human life.
As an information tool for ethical studies of science and technology, the best that can be said is that Indicators offers mountains of data for the taking—levels of funding by field of study, patent activity by universities, size of university departments, and so on. But if the ethical subject is, for example, conflict of interest among university scientists, Indicators will provide ample data on the extent of private funding for academic research but nothing on, say, the number of universities that require their faculties to adhere to a code of behavior in dealings with industry. If the query is the number of litigation cases between universities and corporations over intellectual property, again, Indicators fails the test.
But on balance, a point can be reached where too much is asked of a report that was always meant to be statistical. Indicators is widely praised, universally used, and admiringly emulated. The problem for users with an interest in ethics and the social sciences is that the publication does not address societal and economic outcomes, leaving the reader with the sense that science mainly looks inward while growing in size and importance worldwide. As for technological growth, the reader has no guidance for judging its relative social benefits.
Science and engineering are such powerful forces for change that their statistical treatment will continue to evolve. Very little systematic research, however, has been done to better reflect the vast ramifications of science and technology on society and economies, raising the issue of what Indicators is in fact supposed to indicate. The Organisation for Economic Co-operation and Development (OECD) in Paris, established after World War II, began such metrics as part of the postwar reconstruction of Europe. The work of that organization continues with its periodic reports on various fields of technology and their social and economic importance. And, of course, other countries, as mentioned, confidently persist in attempting to measure the social impact of science and technology.
By 2005 every industrial country as well as the twenty-five-member European Union (EU) had issued its own science and engineering indicators. The EU, Japan, and most of the large but less developed countries such as Brazil, India, and China tended to stress the societal dimensions as well as the purely statistical treatment of science and technology. The popularity of Indicators seems to support the notion that science and technology are increasingly indispensable tools of economic progress and that countries more than ever feel the need to keep pace with one another.
WIL LEPKOWSKI
SEE ALSO Education; Social Indicators.
BIBLIOGRAPHY
Godin, Benoît. (2002). Are Statistics Really Useful? Myths and Politics of Science and Technology Indicators. Project on the History and Sociology of S&T Statistics, Paper no. 20, 39 pages. Quebec City, Canada: National Institute of Scientific Research, University of Quebec.
Godin, Benoît. (2003). The Most Cherished Indicator: Gross Domestic Expenditure on R&D (GERD). Project on the History and Sociology of S&T Statistics, Paper no. 22, 25 pages. Quebec City, Canada: National Institute of Scientific Research, University of Quebec.
Science and Public Policy. (1992). 19(5–6). Special issue.