COMPUTER INDUSTRY
Computers have become a useful and necessary part of modern society. They are used in businesses of all types, from mail order and retail sales to communications such as telephone service and Internet access. Computers are prevalent in hospitals and supermarkets, universities and malls, restaurants and government agencies. By 1998, over 40% of all families in the United States had a personal computer.
The earliest machine used for computing was the abacus, a simple system of beads sliding on wires that dates back to perhaps 3000 B.C. in Babylon and was still in use in the 1990s. The next major improvement came in 1642, when Blaise Pascal (1623–1662) developed a mechanical adding machine he called the Pascaline. In 1694, Gottfried Wilhelm von Leibniz (1646–1716) modified the Pascaline so that it could multiply as well. The Englishman Charles Babbage (1791–1871) designed the first modern computer, the Analytical Engine, which used punched cards. The American Herman Hollerith (1860–1929) applied the punched-card technique to a machine built to tabulate the results of the 1890 U.S. Census. He founded the Tabulating Machine Co. in 1896, which became International Business Machines (IBM) in 1924. In 1940, Dr. John V. Atanasoff and his assistant Clifford Berry developed the first electronic computer circuits, which used Boolean algebra. In 1944, IBM finished the Mark I computer, which computed using electromechanical relays.
From this point, computer history is marked off in "generations." The first generation, lasting roughly from the late 1940s to the mid-1950s, relied on vacuum tubes, which accounted for the machines' characteristically huge size; another limitation was that they could be programmed only in low-level machine language. The second generation, from approximately the mid- to late 1950s into the early 1960s, replaced the bulky vacuum tubes with transistors, leading to smaller, more efficient, and less costly machines. Improvements in programming languages gave these computers greater flexibility, and the new hardware created new jobs in the computer industry, such as programmer and software developer.
The third generation, from the mid-1960s to 1971, was based on the integrated circuit, which replaced individual transistors and reduced both the heat output and the size of computers. Another new development was the operating system, a single central program that controlled access to numerous other programs. A new programming language called BASIC was also developed in this period by two Dartmouth professors, John Kemeny and Thomas Kurtz. The fourth generation began in 1971 and continued into the late 1990s, driven by the development of large-scale integrated circuits, which again reduced the size and price of computers. In 1975, the first personal computer, the Altair 8800, was introduced by Micro Instrumentation and Telemetry Systems (MITS). IBM released its own personal computer in 1981; it used the Disk Operating System (DOS) supplied by Bill Gates of Microsoft, which became one of the most prominent software companies. Apple brought out its Macintosh computer in 1984, and by 1986 there were over 30 million computers in the United States.
Other important computer businesses that began during the 1980s were Compaq Computers, Sun Microsystems, and Unisys Corporation. In the late 1980s, Texas Instruments and Motorola marketed new microprocessors. Microsoft's Windows (1985), Windows 3.0 (1990), Windows NT (1993), Windows 95 (1995), and Windows 98 (1998) became extremely popular operating systems because their graphical interfaces made them easy to use. In 1997, Microsoft's Office 97 for businesses saw sales totaling $78.8 million. Other very popular software of the 1990s included computer games such as "Riven: The Sequel to Myst," the top seller in 1997.
The World Wide Web came about through the efforts of Tim Berners-Lee. In 1989 he helped develop a system of "hyperlinks" that let users jump directly to related information, and by August 1991 that system was running on the Internet, greatly improving the sharing of data. E-mail was already a popular way to exchange messages on the Internet. The number of Internet users grew vastly throughout the 1990s; by 1998, estimates put the number of people on-line worldwide at well over 100 million.
Trends in the computer industry in the late 1990s included rental and lease options on computer systems, numerous models of personal computers priced below $1,000, portable laptop computers, and a shift in popularity from large mainframe business computers to "client/server" systems, which used sets of smaller, faster, and cheaper computers. Another innovation was "e-commerce": consumers could browse on-line catalogs and place orders, goods were purchased directly on-line, and banking and investments were managed through the Internet. By the late 1990s, it was estimated that over 400,000 businesses worldwide had Web sites.
By the end of the twentieth century, the personal computer had become part of the average citizen's daily life. The demand for workers with computer skills was expected to keep increasing as the computer industry continued to play an important role in the strength of the American economy.
See also: Paul Allen, Steve Case, Bill Gates, Internet, Microsoft, Netscape, Stephen Wozniak
FURTHER READING
Dvorak, John C. "What Ever Happened to . . . the First Computer?" [cited January 12, 1999]. Available from the World Wide Web: web3.insitepro.com/insite_pro/sess.
McConnell, Stacy A., ed., and Linda D. Hall, assoc. ed. Dun & Bradstreet/Gale Industry Reference Handbooks: Computers and Software. Detroit: Gale, 1998.
"A Chronology of Computer History," [cited January 12, 1999] available from the World Wide Web (or On Line) @ www.cyberstreet.com/hcs/museum/chron.htm.
"Computers: History and Development," [cited January 12, 1999] available from the World Wide Web (or On Line) @ www.digitalcentury.com/encyclo/update/comp_hd.html.
Stearns, Peter N., and John H. Hinshaw. The ABC-CLIO World History Companion to the Industrial Revolution. Santa Barbara, CA: ABC-CLIO, 1996, s.v. "Computers."