Computer science
Computer science is the study of the theoretical foundations of information and computation and how they can be implemented in computer systems.[1][2][3] It is a broad discipline, with many fields. For example, computer programming involves the use of specific programming languages to craft solutions to concrete computational problems. Computer graphics relies on algorithms that help generate and alter visual images synthetically. Computability theory helps us understand what can and cannot be computed using current models of computation. On a fundamental level, computer science enables us to communicate with a machine, allowing us to translate our thoughts and ideas into machine language, to give instructions that the machine can follow, and to obtain the types of responses we desire.
Computer science has touched practically every aspect of modern-day life. For instance, it has led to the invention of general-purpose computers, for tasks ranging from routine writing and computing to specialized decision making. It has led to the development of the Internet, search engines, e-mail, instant messaging, and e-commerce, bringing about a revolution in our ability to access and communicate information and to conduct financial transactions. By enabling the development of computer graphics and sound systems, it has led to new ways of creating slides, videos, and films. These, in turn, have given birth to new approaches for teaching and learning. For research in various fields, computer science has greatly enhanced the processes of data gathering, storage, and analysis, including the creation of computer models. By fostering the development of computer chips, it has aided in the control of such things as mobile phones, home appliances, security alarms, heating and cooling systems, and space shuttles. In medicine, it has led to the creation of new diagnostic and therapeutic approaches. For national defense, it has led to the development of precision weaponry. Through the development of robots, it has enabled the automation of industrial processes and helped in such tasks as defusing bombs, exploring uncharted territories, and finding disaster victims.
On the downside, knowledge of computer science can also be misused, such as in creating computer viruses, computer hacking, and "phishing" for private information. These activities can lead to huge economic losses, theft of identity and confidential information, and breach of national security. In addition, the fruits of computer science—particularly the Internet and its associated forms of communication—can be used to spread falsehoods, motivate immoral or unethical behavior, or promote acts of terrorism and war. Such misuse can create enormous problems for society.
History
The earliest known tool for computation was the abacus, thought to have been invented in Babylon around 2400 B.C.E. It was originally used by drawing lines in sand and placing pebbles on them. In the fifth century B.C.E., the Indian grammarian Pāṇini formulated sophisticated rules of grammar for Sanskrit. His work became the forerunner to modern formal language theory and a precursor to computing. Between 200 B.C.E. and 400 C.E., Jaina mathematicians in India invented the logarithm. Much later, in the early seventeenth century, John Napier discovered logarithms for computational purposes, and that was followed by the invention of various calculating tools.
None of the early computational devices were computers in the modern sense. It took considerable advances in mathematics and theory before the first modern computers could be designed. Charles Babbage, called the "father of computing," described the first programmable device—the "analytical engine"—in 1837, more than a century before the first computers were built. His engine, although never successfully constructed, was designed to be programmed—the key feature that set it apart from all preceding devices.
Prior to the 1920s, the term computer referred to a human clerk who performed calculations, often working under the direction of a physicist. Thousands of these clerks, many of them women trained in calculus, were employed in commerce, government, and research establishments. After the 1920s, the expression computing machine was applied to any machine that performed the work of a human computer—especially work that involved following a list of mathematical instructions repetitively.
Kurt Gödel, Alonzo Church, and Alan Turing were among the early researchers in the field that came to be called computer science. In 1931, Gödel introduced his "incompleteness theorem," showing that there are limits to what can be proved and disproved within a formal system. Later, Gödel and others defined and described these formal systems.
In 1936, Turing and Church independently introduced the formalization of an algorithm (a set of mathematical instructions), set limits on what can be computed, and described a "purely mechanical" model for computing. These topics are covered by what is now called the Church–Turing thesis, which claims that any calculation that is possible can be performed by an algorithm running on a mechanical calculation device (such as an electronic computer), if sufficient time and storage space are available.
Turing, who has been called the "father of computer science," also described the "Turing machine"—a theoretical machine with an infinitely long tape and a read/write head that moves along the tape, changing the values along the way. Such a machine could never be physically built because of its infinite tape, but the model can simulate the computation of any algorithm that can be performed on a modern computer.
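A minimal sketch of this model, written in Python for illustration (the states, transition table, and input below are hypothetical examples, not taken from Turing's formulation), shows the essential pieces: a tape, a read/write head, and a table of transition rules. The tape is simply extended on demand to stand in for the infinitely long tape.

```python
# Minimal Turing machine simulator (illustrative sketch only).
# transitions maps (state, symbol) -> (new_symbol, move, new_state).

def run_turing_machine(transitions, tape, start_state, halt_state, blank="_"):
    tape, head, state = list(tape), 0, start_state
    while state != halt_state:
        new_symbol, move, state = transitions[(state, tape[head])]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
        if head == len(tape):          # extend the "infinite" tape to the right
            tape.append(blank)
        elif head < 0:                 # extend it to the left
            tape.insert(0, blank)
            head = 0
    return "".join(tape)

# Example machine: flip every bit of a binary string, halting at the first blank.
flip_bits = {
    ("scan", "0"): ("1", "R", "scan"),
    ("scan", "1"): ("0", "R", "scan"),
    ("scan", "_"): ("_", "R", "halt"),
}
print(run_turing_machine(flip_bits, "1011_", "scan", "halt"))  # prints 0100__
```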
Up to and during the 1930s, electrical engineers built electronic circuits to solve mathematical and logic problems in an ad hoc manner, lacking theoretical rigor. This changed when Claude E. Shannon published his 1937 master's thesis, "A Symbolic Analysis of Relay and Switching Circuits." He recognized that George Boole's work could be used to arrange electromechanical relays (then used in telephone routing switches) to solve logic problems. This concept, using the properties of electrical switches to do logic, is the basic concept that underlies all electronic digital computers. Shannon's thesis became the foundation of practical digital circuit design when it became widely known among the electrical engineering community during and after World War II.
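To suggest how arrangements of switches correspond to logic, here is a small illustrative sketch in Python (not part of the original text; the gate functions and half-adder are standard textbook constructions used as a hypothetical example). Switches in series behave like AND, switches in parallel like OR, and a normally-closed contact like NOT; composing them yields a half-adder, the basic building block of binary arithmetic in digital circuits.

```python
# Illustrative sketch: modeling relay/switch logic with Boolean operations.

def AND(a, b): return a and b      # two switches in series
def OR(a, b):  return a or b       # two switches in parallel
def NOT(a):    return not a        # a normally-closed contact
def XOR(a, b): return OR(AND(a, NOT(b)), AND(NOT(a), b))

def half_adder(a, b):
    """Add two bits, returning (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(f"{int(a)} + {int(b)} -> sum {int(s)}, carry {int(c)}")
```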
Shannon went on to found the field of information theory with his 1948 paper on "A Mathematical Theory of Communication." In it, he applied probability theory to the problem of how to best encode the information a sender wants to transmit. This work is one of the theoretical foundations for many areas of study, including data compression and cryptography.
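A worked example may make the idea concrete. Shannon's entropy, H = -Σ p_i log₂(p_i), gives a lower bound, in bits per symbol, on how compactly a source can be losslessly encoded. The short Python sketch below (illustrative only; the sample message is made up) estimates it from symbol frequencies.

```python
# Illustrative sketch: Shannon entropy of a sample message, in bits per symbol.
# H = -sum(p * log2(p)) over the frequency p of each distinct symbol.
from collections import Counter
from math import log2

def entropy(message):
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

message = "abracadabra"                       # made-up example
print(f"{entropy(message):.3f} bits/symbol")  # about 2.040
```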
During the 1940s, with the advent of electronic digital equipment, the phrase computing machines gradually gave way to just computers, referring to machines that performed the types of calculations done by human clerks in earlier years.
Over time, as it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general and branched into many subfields, such as artificial intelligence. Computer science began to be established as a distinct academic discipline in the 1960s, with the creation of the first computer science departments and degree programs.[4]
In 1975, Bill Gates cofounded Micro-Soft, later known as Microsoft Corporation, with former classmate Paul Allen. Landing lucrative deals to develop the operating systems for the computers of that time, and employing aggressive marketing practices, Microsoft became the largest software company in the world. Currently, its premier product, the Windows operating system, dominates the personal computer operating system market.
One year after Gates founded Microsoft, another young man, Steve Jobs, founded Apple Computer Co. with Steve Wozniak. From 1976 onward, Apple led the personal computer market with its Apple I, II, and III lines of desktop computers, until IBM (International Business Machines Corporation) released its IBM-PC in 1981. The rivalry between Apple and Microsoft has continued well into the twenty-first century, with Apple possessing a relatively small portion of the computer market. With computers becoming smaller and more powerful, they have become indispensable to modern life, and some are even used in decision-making capacities.
Major achievements
Despite its relatively short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society. These include:
- A formal definition of computation and computability, and proof that there are computationally unsolvable and intractable problems.[5]
- The concept of a programming language, a tool for the precise expression of methodological information at various levels of abstraction.[6]
- The invention of general-purpose computers that can aid us in many tasks, including writing, computing, information storage, and decision-making.
- The development of the Internet, search engines, e-mail, instant messaging, digital signatures, and electronic commerce.
- The enhancement of research tools, such as data gathering, storage, and analysis.
- The opportunity to create computer models to simulate climate patterns, ecological trends, changes in traffic volume, and so forth.
- The enabling of new types of scientific research, such as computational physics and computational chemistry.[7]
- The development of precision weaponry, thus drastically lowering collateral damage and minimizing risk for military personnel using the weapons.
- The creation of medical technologies for diagnostic and therapeutic purposes.
- The automation of assembly-line manufacturing, such as for automobiles.
- The use of embedded computer chips that help control such things as mobile phones, home appliances, security alarms, heating and cooling systems, children's toys, and space shuttles.
- The development of robots for such endeavors as scientific testing, defusing bombs, finding disaster victims, and exploration of uncharted territories on Earth and in space. Robots have also enabled the automation of industrial processes.
Relationship with other fields
Despite its name, computer science rarely involves the study of computers themselves. Renowned computer scientist Edsger Dijkstra is often quoted as saying, "Computer science is no more about computers than astronomy is about telescopes." It may be argued that Dijkstra was referring to a computer in a narrow sense—that is, a digital computer. If, however, a computer were defined as "any physical system or mathematical model in which a computation occurs," then the definition of computer science as "the science that studies computers" is broadened beyond the study of digital computers.
The design and deployment of physical computer systems is generally considered the province of disciplines other than computer science. For example, the study of computer hardware is usually considered part of computer engineering, while the study of commercial computer systems and their deployment is often placed under information technology or information systems.
On the other hand, some have criticized computer science as being insufficiently scientific. This view is espoused in the statement "Science is to computer science as hydrodynamics is to plumbing," credited to Stan Kelly-Bootle[8] and others. There has, however, been much cross-fertilization of ideas between the various computer-related disciplines. In addition, computer science research has often crossed into other disciplines, such as artificial intelligence, cognitive science, physics (quantum computing), and linguistics.
Computer science is considered by some to have a much closer relationship with mathematics than many scientific disciplines.[9] Early computer science was strongly influenced by the work of mathematicians such as Kurt Gödel and Alan Turing, and there continues to be a useful interchange of ideas between the two fields in areas such as mathematical logic, category theory, domain theory, and algebra.
The relationship between computer science and software engineering is a contentious issue, further muddied by disputes over what the term "software engineering" means, and how computer science is defined. Some people believe that software engineering is a subset of computer science. Others, including David Parnas, believe that the principal focus of computer science is studying the properties of computation in general, while the principal focus of software engineering is the design of specific computations to achieve practical goals—thus making them different disciplines.[10] Yet others maintain that software cannot be engineered at all.
Fields of computer science
Mathematical foundations
- Cryptography
- Algorithms for protecting private data, including encryption.
- Graph theory
- Foundations for data structures and searching algorithms (a brief search example follows this list).
- Mathematical logic
- Boolean logic and other ways of modeling logical queries.
- Type theory
- Formal analysis of the types of data, and the use of these types to understand properties of programs — especially program safety.
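As a small illustration of graph theory underpinning data structures and search (a hypothetical Python sketch, not drawn from the article; the graph is a made-up example), the following breadth-first search finds a shortest path, measured in edges, in a graph stored as an adjacency list.

```python
# Illustrative sketch: breadth-first search over an adjacency-list graph.
from collections import deque

def shortest_path(graph, start, goal):
    queue = deque([[start]])   # queue of partial paths
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None                # no path exists

example_graph = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D", "E"],
    "D": ["E"],
    "E": [],
}
print(shortest_path(example_graph, "A", "E"))  # -> ['A', 'C', 'E']
```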
Theories of computation
- Automata theory
- The study of abstract machines and problems they are able to solve.
- Computability theory
- What is calculable with the current models of computation. Proofs developed by Alan Turing and others provide insights into the possibilities of what may be computed and what may not.
- Computational complexity theory
- Fundamental bounds (especially time and storage space) on classes of computations.
Algorithms and data structures
- Analysis of algorithms
- The time and space complexity of algorithms.
- Algorithms
- Formal logical processes used for computation, and the efficiency of these processes.
- Data structures
- The organization of and rules for the manipulation of data.
- Genetic algorithms
- A genetic algorithm is a search technique for finding approximate solutions to optimization and search problems (a minimal sketch follows this list).
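The genetic-algorithm entry above can be made concrete with a minimal sketch (Python, illustrative only; the all-ones target, fitness function, and parameters are made-up examples). A population of candidate bit strings is repeatedly improved by keeping the fitter half, recombining parents with single-point crossover, and applying random mutation.

```python
# Illustrative sketch of a genetic algorithm: evolve bit strings toward an
# all-ones target via selection, single-point crossover, and mutation.
import random

LENGTH, POP_SIZE, MUTATION_RATE = 20, 30, 0.02

def fitness(individual):
    return sum(individual)                     # count of 1-bits

def crossover(a, b):
    point = random.randrange(1, LENGTH)        # single-point crossover
    return a[:point] + b[point:]

def mutate(individual):
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in individual]

population = [[random.randint(0, 1) for _ in range(LENGTH)]
              for _ in range(POP_SIZE)]

for generation in range(200):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == LENGTH:       # stop once the target is reached
        break
    parents = population[:POP_SIZE // 2]       # keep the fitter half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

print(f"generation {generation}:", "".join(map(str, population[0])))
```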
Programming languages and compilers
- Compilers
- Ways of translating computer programs, usually from higher-level programming languages to lower-level ones. They are based heavily on mathematical logic.
- Programming languages
- Formal language paradigms for expressing algorithms and the properties of these languages, such as the problems they are suited to solve.
Databases
- Data mining
- The study of algorithms for searching and processing information in documents and databases. It is closely related to information retrieval.
Concurrent, parallel, and distributed systems
- Concurrency
- The theory and practice of simultaneous computation and resource allocation.
- Distributed computing
- Computing using multiple computing devices over a network to accomplish a common objective or task.
- Networking
- Algorithms and protocols for reliably communicating data across different shared or dedicated media, often including error correction.
- Parallel computing
- Simultaneous execution of a task on multiple devices to speed up computation time (a brief sketch follows this list).
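As a minimal illustration of the parallel computing entry above (a hypothetical Python sketch using the standard library's concurrent.futures module; the squared-sum workload is a made-up example), the task is split into chunks, the chunks are computed in separate processes, and the partial results are combined.

```python
# Illustrative sketch: splitting a computation across worker processes
# and combining the partial results.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))   # one chunk of the work

if __name__ == "__main__":
    chunks = [(0, 250_000), (250_000, 500_000),
              (500_000, 750_000), (750_000, 1_000_000)]
    with ProcessPoolExecutor() as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # matches the result of computing the whole range sequentially
```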
Computer architecture
- Computer architecture
- The design, organization, optimization, and verification of a computer system and its component parts, such as the central processing unit (CPU).
- Operating systems
- Systems for managing computer programs and providing the basis of a usable system.
Software engineering
- Computer programming
- Problem solving and its implementation in a programming language.
- Formal methods
- Mathematical approaches for describing and reasoning about software designs.
- Software engineering
- The principles and practice of designing, developing, and testing programs, as well as proper engineering practices. There is, however, considerable debate over the meaning of "software engineering" and whether it is the same thing as "computer programming."
Artificial intelligence
- Artificial intelligence
- The implementation and study of systems that appear to exhibit autonomous intelligence or behavior.
- Automated reasoning
- Study and theory of implementing reasoning capabilities in a computer via software.
- Robotics
- The design and construction of robots and algorithms for controlling the behavior of robots.
- Computer vision
- Algorithms for identifying three-dimensional objects from a two-dimensional picture.
- Machine learning
- Automated creation of a set of rules and axioms based on input.
Computer graphics
- Computer graphics
- Algorithms for generating visual images synthetically, and for integrating or altering visual and spatial information sampled from the real world.
- Image processing
- Determining information from an image through computation.
- Human-computer interaction
- The study and design of computer interfaces that people use.
Scientific computing
- Bioinformatics
- The use of computer science to maintain, analyze, and store biological data, and to assist in solving biological problems such as protein folding.
References
- 1998 ACM Computing Classification System. Association for Computing Machinery (1998).
- "Computing Curricula 2001: Computer Science." IEEE Computer Society and the Association for Computing Machinery (December 15, 2001).
Notes
- ↑ "Computer science is the study of information" Department of Computer and Information Science, Guttenberg Information Technologies
- ↑ "Computer science is the study of computation." Computer Science Department, College of Saint Benedict, Saint John's University
- ↑ "Computer Science is the study of all aspects of computer systems, from the theoretical foundations to the very practical aspects of managing large software projects." Massey University
- ↑ Denning, P.J. (2000). Computer science:the discipline. Encyclopedia of Computer Science.
- ↑ Constable, R.L., (March 2000) "Computer Science: Achievements and Challenges circa 2000."
- ↑ "The computer revolution is a revolution in the way we think and in the way we express what we think. The essence of this change is the emergence of what might best be called procedural epistemology—the study of the structure of knowledge from an imperative point of view, as opposed to the more declarative point of view taken by classical mathematical subjects." Quoted in Abelson, H., G. J. Sussman, and J. Sussman. Structure and Interpretation of Computer Programs (2nd ed.) Cambridge, MA: MIT Press, 1996. ISBN 0262011530
- ↑ Constable, R.L. "Nature of the Information Sciences." (1997)
- ↑ Computer Language Oct. 1990.
- ↑ Denning, P.J. (2000). Computer science:the discipline. Encyclopedia of Computer Science.
- ↑ Parnas, David L. (1998). Software Engineering Programmes are not Computer Science Programmes. Annals of Software Engineering 6: 19–37., p. 19: "Rather than treat software engineering as a subfield of computer science, I treat it as an element of the set, {Civil Engineering, Mechanical Engineering, Chemical Engineering, Electrical Engineering,....}."
Credits
New World Encyclopedia writers and editors rewrote and completed the Wikipedia article in accordance with New World Encyclopedia standards. This article abides by the terms of the Creative Commons CC-by-sa 3.0 License (CC-by-sa), which may be used and disseminated with proper attribution. Credit is due under the terms of this license, which can reference both the New World Encyclopedia contributors and the selfless volunteer contributors of the Wikimedia Foundation.
Note: Some restrictions may apply to use of individual images which are separately licensed.