Revision as of 15:54, 7 November 2007
A computer is any device (mechanical, electronic, or human) capable of manipulating symbols according to a set of rules, sometimes called a calculus. Although most people associate the word computer with modern desktop PCs, those machines are only the latest generation of computers to achieve widespread use, and at their core are very much like a child using arithmetic to add numbers together.
During World War II, the first electronic computers (machines that perform numerical calculations far faster than humans) were developed by the British and U.S. governments as a result of secret military projects[1][2]. These first computers did not remain secret for long; they were adopted by private industry, and they quickly grew in usefulness while decreasing in size and cost. Today, computers are ubiquitous household objects, perhaps unrecognized in the form of a tiny microprocessor embedded in a gadget such as a phone or a TV remote. Even defining the word computer may spark a debate, because so many different kinds of computers exist, and they are used for so many different kinds of activities.
The history of computing is complex, and until well into the 20th century, the word computer meant a person performing a computation. The desire for easily obtainable results to ever more complex computations had long existed, but technology was not yet advanced enough to offer a practical solution. People had long sought mechanical devices to help with mathematical calculations, inventing the abacus[3], the slide rule[4], and a host of mechanical adding machines[5]. But the electronic computer's rapid evolution forever changed science, the military, and business. The electronic computer has vastly expanded the human ability to store and share information; as such, its invention may be a milestone for humanity on a par with the advent of writing and materials to write on (millennia ago)[6], or with the invention of the printing press (c. 1450)[7]. Not all of these changes may be regarded as positive, however; the explosive entry of the computer into all facets of life is sometimes referred to as the digital revolution[8].
The nature of computing
Some people define a computer as a machine for manipulating data according to instructions known as a program. However, this definition may only make sense to people who already know what a computer can do. Computers are extremely versatile. In fact, they are universal information-processing machines, but at the deepest level, what they really do is perform arithmetic. Computers and mathematics are closely related. The theory of computation is a branch of mathematics, and its evolution, pioneered by brilliant twentieth-century mathematicians such as Alan Turing (among many others), enabled the invention of electronic computers. And as usual in mathematics, their work built on that of earlier mathematicians as described in the history of computing.
Today, most computers do arithmetic using the binary numeral system, because a binary number can be represented by an array of on-off switches, with each 0 or 1 digit, or bit, stored in one switch. In early electronic computers, the switches used for each digit were electromagnetic switches, also called relays. Later, vacuum tubes replaced electromagnetic relays, and eventually transistors replaced both relays and tubes. Transistors can now be manufactured as tiny devices, almost molecular in size, embedded within silicon chips. These tiny transistorized computers work on the same principles as the first giant relay- and vacuum-tube-based computers (which occupied entire buildings)[9]. More information on how electronic computers work is available in computer architecture.
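The correspondence between binary digits and on-off switches can be sketched in a few lines of Python (the variable names here are illustrative, not from any particular machine):

```python
# A binary number is just a row of on/off switches: each bit that is
# "on" contributes a power of two to the value.
n = 42
bits = bin(n)[2:]  # binary digits of 42 as a string: "101010"

# Reconstruct the value by summing 2**i for every switch that is on.
# enumerate(reversed(bits)) pairs each bit with its place value i.
value = sum(2**i for i, b in enumerate(reversed(bits)) if b == "1")

print(bits)   # 101010
print(value)  # 42
```

The same idea, scaled up to billions of transistors acting as switches, underlies every modern processor.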
Initially, mathematicians and scientists were the only users of computers. But today, what we tend to think of as a computer consists not only of the underlying hardware, with its limited instruction set that performs arithmetic, but also an operating system, a set of programs that allows people to use the computer more easily. The operating system is software (programs running on a computer). Without an operating system, a computer is not very useful; the operating system helps people write new programs for the computer and perform many other activities on it.
Academia and professional societies
Since the early 1980's, most universities have offered majors in academic disciplines such as computer science or computer engineering, devoted to the design of hardware and software for computers. These general fields of study soon came to consist of many sub-fields. In addition, most academic disciplines, and most businesses, use computers as tools.
Below are some of the professional and academic disciplines that teach the techniques to construct, program, and use computers. There is often overlap of functions and terminology across these categories:
- artificial intelligence or machine learning (two sub-fields for solving difficult problems in software)
- computer architecture (the study of how computers work, and how specific computers can be built)
- compilers (programs that translate code written in a programming language into a form the machine can execute)
- computer engineering (a branch of electrical engineering that focuses both on hardware and operating system design)
- computer science (the academic study of computers and computation, including aspects of both theory and implementation)
- geographic information systems (combining latitude and longitude information with computer mapping programs)
- information systems or information technology (study of computer systems, usually in a business or organizational context)
- machine translation (software for translating one natural language into another)
- programming languages (specifications for how people ought to write computer programs)
- software engineering (management of the process of creating complex software systems)
Professional societies dedicated to computers include the British Computer Society, the Association for Computing Machinery (ACM) and the IEEE Computer Society.
The economics of the computer industry
Since the 1950's, a vigorous cycle of business activity has arisen from the development of computers, including many corporations engaged in creating computer hardware, operating systems, or other software. The business climate has evolved rapidly along with the technology, with some companies being born and meeting their demise in rapid succession, while other companies survived for decades (though usually by changing their focus rapidly in response to industry growth).
The importance of standards
The ability of many different companies to make computer parts, hardware or software, comes from industry-wide adoption of standards. Various consortia and United States or international standards organizations serve as arbitrators of computing standards, including ANSI, W3C, ECMA and ISO. In addition to formal standards, many informal standards have arisen because consumers "vote" by purchasing certain products. The first written standards arose from the Request for Comment (RFC) process, now stewarded by the Internet Engineering Task Force (IETF)[10]; the process was born in the late 1960's as a result of a U.S. Defense Advanced Research Projects Agency (DARPA) initiative and led eventually to the development of the internet. The open nature of the process, in which any person could submit a proposal (an RFC), was remarkable, and the IETF proved to be about as effective as formally endorsed standards bodies at creating usable and widely adopted standards. The non-proprietary nature of the RFC process also foreshadowed the later development, in the 1980's, of the open source software movement. Some standards also resulted from a deliberate sharing of specifications by industry participants, notably the open specifications leading to the industry-wide IBM compatible PC beginning in the early 1980's.
Pace of growth (Moore's law)
The quick pace of growth in computer engineering was codified into a widely quoted rule of thumb, called Moore's law[11] (the observation that the number of transistors on a chip doubles roughly every two years), first publicized by Gordon Moore (for many years CEO of Intel). For decades after the invention of the computer, this economic boom centered in the United States and led to the widespread availability of personal computers (affordable by individuals) in the 1980's. Beginning in the 1990's, the computer industry also spread rapidly overseas, especially into Europe, Russia, China and India. Computers are now a world-wide phenomenon.
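The compounding effect of the rule of thumb is easy to compute. A minimal sketch, assuming the commonly quoted doubling period of about two years (the function name is illustrative):

```python
def projected_growth(years, doubling_period=2):
    """Multiplicative growth factor after `years` years, if capacity
    doubles once every `doubling_period` years (Moore's rule of thumb)."""
    return 2 ** (years / doubling_period)

# Ten doublings over twenty years yield a 1024-fold increase.
print(projected_growth(20))  # 1024.0
```

Exponential growth like this is why a span of only a few decades separates room-sized relay machines from microprocessors embedded in phones.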
References
- ↑ Colossus: The World's First Electronic Computer. Pico Technology (n.d.). Retrieved on 2007-04-24.
- ↑ The ENIAC Museum Online. University of Pennsylvania School of Engineering and Applied Sciences (SEAS) (n.d.). Retrieved on 2007-04-23.
- ↑ Origin and Development of the Chinese Abacus. Journal of the ACM (JACM), Volume 6, Issue 1 (January 1959), pp. 102–110. Retrieved on 2007-04-24.
- ↑ Slide Rule History. The Oughtred Society (2006). Retrieved on 2007-04-24.
- ↑ Adding Machines. The Museum of HP Calculators, David G. Hicks (2005). Retrieved on 2007-04-24.
- ↑ The Invention of Paper. Wisconsin Paper Council (2004). Retrieved on 2007-04-24.
- ↑ The Printing Press. The History Guide, Steven Kreis (2004). Retrieved on 2007-04-24.
- ↑ The Digital Revolution, the Informed Citizen, and the Culture of Democracy. Henry Jenkins and David Thorburn, introduction to Democracy and New Media (Cambridge: MIT Press, 2003). Retrieved on 2007-04-24.
- ↑ The History of the Integrated Circuit: The Transistor vs. the Vacuum Tube. The Nobel Foundation (2007). Retrieved on 2007-04-24.
- ↑ IETF: History, Background, and Role in Today's Internet. Gary C. Kessler (1996). Retrieved on 2007-04-23.
- ↑ Moore's Law. Intel Corporation (n.d.). Retrieved on 2007-04-23.