Computer Science

Summary

Computer science is the study of real and imagined computers, their hardware and software components, and their theoretical basis and applications. Almost every aspect of modern life involves computers in some way. Computers allow people to communicate with others almost anywhere in the world. They control machines, from industrial assembly equipment to children's toys. They execute worldwide stock market transactions and help analyze and regulate those markets. They allow physicians to treat patients even when physician and patient cannot meet in the same location. Researchers endeavor to make the computers of science fiction everyday realities.

Definition and Basic Principles

Computer science studies all aspects of computers, applied and theoretical. However, considerable disagreement exists over the definition of such basic terms as "computer science," "computer," "hardware," and "software." This disagreement can be seen as a testament to the vitality and relative youth of the field. The Association for Computing Machinery's computing classification system, revised in 1998 and again in 2012, is one attempt to map the boundaries of computer science.


The science part of computer science refers to the underlying theoretical basis of computers. Broadly speaking, computation theory, part of mathematics, looks at what mathematical problems are solvable. In computer science, it focuses on which problems can be solved using a computer. In 1936, English mathematician Alan Turing attempted to determine the limits of mechanical computation using a theoretical device, now called a Turing machine. Mathematics also forms the basis for research in programming languages, artificial languages developed for computers. Because computers do not have the ability to think like humans, these languages are very formal, with strict rules on how programs using these languages can be written and used.
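
As a rough illustration of the idea, the following short program (written in Python; the two-state, bit-flipping machine it simulates is an invented toy, not one of Turing's own) shows a Turing machine as nothing more than a table of rules that reads a symbol, writes a symbol, moves along a tape, and changes state.

# Minimal sketch of a Turing machine: a hypothetical two-state machine that
# flips every bit on its tape and then halts (an invented toy example).
def run_turing_machine(tape):
    rules = {
        # (state, symbol) -> (symbol to write, head movement, next state)
        ("flip", "0"): ("1", 1, "flip"),
        ("flip", "1"): ("0", 1, "flip"),
        ("flip", "_"): ("_", 0, "halt"),   # blank cell: stop
    }
    tape = list(tape) + ["_"]
    state, head = "flip", 0
    while state != "halt":
        symbol, move, state = rules[(state, tape[head])]
        tape[head] = symbol
        head += move
    return "".join(tape).rstrip("_")

print(run_turing_machine("01011"))   # prints 10100

Despite its simplicity, this rule-table model captures what mechanical computation means, which is why it remains the standard yardstick for what computers can and cannot solve.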

Another part of the underlying structure of computer science is engineering. The physical design of computers involves a number of disciplines, including electrical engineering and physics. The quest for smaller, faster, more powerful devices has led to research in fields such as quantum physics and nanotechnology.

Computer science is not just programming. Computer scientists view programming languages as tools to research such issues as how to create better programs, how information is represented and used by computers, and how to do away with programming languages altogether and instead use natural languages (those used by people).

Background and History

Computer science can be seen as the coming together of two separate strands of development. The older strand is technology and machines, and the newer is the theoretical one.

Technology. The twentieth century saw the rapid rise of computers. Computers began to incorporate electronic components instead of the mechanical components that had gone into earlier machines, and they made use of the binary system rather than the decimal system. During World War II, major developments in computer science arose from wartime efforts to automate artillery calculations and antiaircraft fire control. Out of this work, the Hungarian-born American mathematician John von Neumann wrote a highly influential draft report on the design of computers. He described a computer architecture in which a program is stored in memory along with the data it operates on. The decades after World War II saw the development of programming languages and more sophisticated systems, including computer networks, which allow computers to communicate with one another.

Theory. The theoretical development of computer science has come primarily through mathematics. One early issue was how to solve various mathematical equations. In the nineteenth and early twentieth centuries, this interest blossomed into research on computation theory, the study of which problems can be solved by mechanical procedures. In 1936, Turing introduced the Turing machine concept in his paper on computable numbers, which answered a question about the limits of computation posed by the German mathematician David Hilbert.

These World War II projects resulted not only in new hardware but also in extensive research on the theory behind what was being done, as well as on what else could be done. Out of this ferment eventually came such work as Norbert Wiener's cybernetics and Claude Shannon's information theory. Modern computer science is an outgrowth of all this work, which continues worldwide in industry, government, academia, and various organizations.

How It Works

Computer Organization and Architecture. The most familiar computers are stand-alone devices based on the architecture that von Neumann sketched out in his draft report. The main processor, or central processing unit (CPU), controls the device: its control unit directs the other components, and its arithmetic logic unit performs calculations.

Electronic memory is used to store the operating system that controls the computer, numerous other programs, and the data needed to run them. Although electronic memory takes several forms, the most common is random access memory (RAM), which typically accounts for most of a computer's electronic memory. Because electronic memory is cleared when the computer is turned off, permanent storage devices, such as hard drives and flash drives, are used to retain data.

Computers have an input/output (I/O) system, which communicates with humans and with other devices, including other computers. I/O devices include keyboards, monitors, printers, and speakers.

The instructions that computers follow are programs written in an artificial programming language. Different kinds of programming languages are used in a computer. Machine language, the only language that computers understand, is used in the processor and is composed solely of the binary digits 0 and 1.

Computers often have subsidiary processors that take some of the processing burden away from the main processor. For example, when directed by the main processor, a video processor can handle the actual placing of images on a monitor. This is an example of parallel processing, in which a computer performs more than one action simultaneously. Parallel processing allows the main processor to do one task while the subsidiary processors handle others. A number of computers use more than one main processor. Although one might think that doubling the number of processors doubles the processing speed, various problems, such as contention for the same memory, considerably reduce this advantage. Individual processors also use parallelism internally to speed up their work; for example, some processors contain multiple cores that each act much like an independent processor.
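
As a rough sketch of how a program can spread independent work across multiple cores (the worker function and task sizes below are arbitrary examples, written in Python), each task here can run on a separate core while the main program collects the combined results.

# Minimal sketch: spreading independent work across processor cores.
# The task (summing squares) and the input sizes are arbitrary examples.
from concurrent.futures import ProcessPoolExecutor

def sum_of_squares(n):
    # A simple, repetitive calculation of the kind computers excel at.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    tasks = [10_000_000, 20_000_000, 30_000_000, 40_000_000]
    # Each task may be assigned to a separate core; results arrive in order.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(sum_of_squares, tasks))
    print(results)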

Modern computers do not actually think. They are best at performing simple, repetitious operations incredibly fast and accurately. Humans get bored and make mistakes; computers, as a rule, do not.

Mathematics. Mathematics underlies a number of areas of computer science, including programming languages. Researchers have long believed that programming languages incorporating rules that are mathematically based will lead to programs that contain fewer errors. Further, if programs are mathematically based, then it should be possible to test programs without running them. Logic could be used to deduce whether the program works. This process is referred to as program proving.
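
As a loose illustration of the idea (not a formal proof system), the preconditions, postconditions, and loop invariant that program proving reasons about can be written down next to the code itself; the integer-division routine below is a hypothetical Python example.

# Illustration of the properties that program proving reasons about.
# Precondition:  a >= 0 and b > 0
# Postcondition: a == q * b + r and 0 <= r < b
def divide(a, b):
    q, r = 0, a
    while r >= b:
        # Loop invariant: a == q * b + r and r >= 0
        q, r = q + 1, r - b
    return q, r

assert divide(17, 5) == (3, 2)  # a runtime check; a proof would use logic instead

A program prover would use logic to show that whenever the precondition holds, the postcondition must hold after the loop finishes, without ever executing the code.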

Mathematics also underlies many of the algorithms that computers use. For example, computer games have a large math and physics component, and game action is often expressed in mathematical formulas that must be calculated as the game progresses. Different algorithms that perform the same task can be compared by analyzing their relative efficiency, an analysis that is mathematically based and independent of any particular computer.
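
For instance, two algorithms for the same task, finding a value in a sorted list, can be compared without reference to any particular machine: linear search makes up to n comparisons, while binary search needs only about log2(n). The Python sketch below is illustrative; the data and target value are arbitrary.

# Two algorithms for the same task, with different efficiencies.
def linear_search(items, target):
    # Checks each item in turn: roughly n comparisons in the worst case.
    for index, value in enumerate(items):
        if value == target:
            return index
    return -1

def binary_search(items, target):
    # Repeatedly halves a sorted list: roughly log2(n) comparisons.
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1

data = list(range(0, 1000, 2))   # a sorted list of even numbers
print(linear_search(data, 626), binary_search(data, 626))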

Software. Computer applications are typically large software systems composed of a number of different programs. For example, a word-processing application might have a core set of programs along with programs for dictionaries, formatting, and specialized tasks. Applications depend on other software for a number of tasks. Printing, for example, is usually handled by the operating system (OS), so applications do not each have to implement their own basic printing routines. The application notifies the OS, which can respond, for example, with a list of available printers. If the user wishes to modify printer settings, the OS contacts the printer software, which then displays the settings. This way, the same interface is always used. To print, the application sends the work to the OS, which passes it to a program that acts as a translator between the OS and the printer. This translator program is called a driver. Drivers work with the OS but are not part of it, allowing new drivers to be developed and installed after an OS is released.
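
A simplified sketch of the driver idea follows (the class and function names are hypothetical, not a real operating-system interface): the OS calls every printer through the same interface, and each driver supplies the device-specific details.

# Simplified sketch of the driver idea: one common interface, many devices.
# The class and method names are hypothetical, not a real OS interface.
class PrinterDriver:
    def print_document(self, data):
        raise NotImplementedError

class LaserDriver(PrinterDriver):
    def print_document(self, data):
        print(f"[laser] rendering {len(data)} bytes")

class InkjetDriver(PrinterDriver):
    def print_document(self, data):
        print(f"[inkjet] rendering {len(data)} bytes")

def os_print(driver: PrinterDriver, data: bytes):
    # The OS only knows the common interface; the driver handles the device.
    driver.print_document(data)

os_print(LaserDriver(), b"report text")
os_print(InkjetDriver(), b"report text")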

Machine language, since it consists only of 0s and 1s, is difficult for people to read. For human convenience, assembly language, which uses mnemonics rather than digits, was developed. Because computers cannot execute this language directly, a program called an assembler is used to translate assembly language into machine language. For higher-level languages such as C++, a compiler translates source code into machine language, often by way of assembly language; languages such as Java are typically compiled to an intermediate bytecode that a virtual machine then executes.
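
As a toy illustration of what an assembler does (the three-instruction "machine" and its opcodes below are invented for this sketch, not a real instruction set), each mnemonic and operand is translated into a fixed pattern of 0s and 1s.

# Toy assembler: translates mnemonics into binary "machine code".
# The three-instruction machine below is invented for illustration only.
OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011"}

def assemble(lines):
    machine_code = []
    for line in lines:
        mnemonic, operand = line.split()
        # Opcode in the high bits, operand (a 4-bit address) in the low bits.
        machine_code.append(OPCODES[mnemonic] + format(int(operand), "04b"))
    return machine_code

program = ["LOAD 5", "ADD 6", "STORE 7"]
print(assemble(program))   # ['00010101', '00100110', '00110111']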

Applications and Products

Computer technology has penetrated nearly every area of modern life. The most important computer uses include communications, education, digitization, and security.

Communication. In the early twentieth century, when someone immigrated to the United States, they knew that communication with those they were leaving behind would probably be limited. In the modern world, people can communicate almost instantly with anyone else, as long as the infrastructure is available. Products such as Skype and Vonage allow people to see and talk to others throughout the world using the internet. Instead of traveling to another person's location, people can hold face-to-face meetings through software such as Cisco's Webex or Zoom.

One of the most far-reaching computer applications is the Internet, the global computer network over which services such as the World Wide Web run. People in the developed world rely heavily on the Internet for news and information, and traditional news sources, such as printed newspapers and magazines, are declining in popularity. Popular twentieth-century technologies such as radio and television have also been affected. The Internet provides entertainment, information, telephone, mail, business services, and shopping.

Computer science also revolutionized telephones and the way people think of telephone communication. Modern telephones are portable computing devices. Smartphones such as the Apple iPhone and Samsung Galaxy provide internet access, email, Global Positioning System (GPS) navigation, music, movies, streaming platforms, cameras, and substantial storage space. Applications (apps) are a burgeoning industry for smartphones. Apps range from games and entertainment to tools that deliver useful information, such as weather forecasts and medical data. Traditional landline telephones had become relatively rare by the 2020s.

Smartphones demonstrate convergence, a trend toward combining devices used for individual activities into a single device that performs various activities and services. For example, most high-definition televisions have an internet connection. Devices such as Roku and Amazon's Fire TV Stick stream internet content to televisions and can be controlled from a desktop, laptop, or mobile phone. Digital video recorders (DVRs) allow users to record a television program and watch it later or pause live broadcasts. DVRs can be programmed to record shows based on the owner's specifications. Although televisions and DVRs are not computers, they contain microprocessors, which are miniature computers. By the late 2010s, smart TVs, televisions with integrated internet and a preloaded operating system that essentially converge computers, televisions, and digital media players into one device, had become commonplace.

Networking is made possible by a vast infrastructure that is constantly being extended and improved. Companies such as Belkin provide the cables, Cisco the equipment, and Microsoft the software that support this infrastructure for providers and users. It is common for homes to have wireless networks with a number of devices connected through a modem, which brings the internet signal in from the internet service provider (ISP), and a router, which manages the devices on the local network.

Education. Distance education through the Internet (online education) is becoming more and more commonplace. Learning (or course) management systems (LMS or CMS) are the means by which many courses are delivered. These can be proprietary products such as Blackboard or D2L, or nonproprietary applications such as Moodle. Through these management systems, students can take entire courses without ever meeting their instructor in person. Applications such as Wimba allow classes to be almost as interactive as traditional classes and record sessions for future viewing. Those participating can be anywhere as long as they can connect to the internet. Through such software, students can ask questions of an instructor, student groups can meet online, and an instructor can hold review sessions. This technology was never more essential than during the COVID-19 pandemic that began in 2020. As schools moved to virtual-only models at the height of the pandemic, students learned solely through online means, relying on services such as Google Classroom and applications like Seesaw to receive their education.

Digitization. This area is concerned with how data are translated for computer usage. Real-world experience is a continuous phenomenon. Long-play vinyl records (LPs) and magnetic tapes captured a direct analog of the original sound. Any change to the LP or tape meant a change to the sound quality, usually introducing distortions. Each copy (generation) made of that analog signal introduced further distortions. LPs were several generations removed from the original recording, and additional distortions were added to accommodate the characteristics of the LP medium.

Rather than record an analog of the original sound, digitization translates the sound into discrete samples, like snapshots of the original. The more samples (the higher the sampling rate) and the more information captured by each sample (the greater the bit depth), the better the result. Digitization enables the storage of data of all types. The samples are encoded into one of the many binary file formats, some open and some proprietary. MPEG-1 Audio Layer 3 (MP3) is an open encoding standard used in many devices. It is a "lossy" standard, meaning the result is lower quality than the original. This tradeoff is made to gain a smaller file size, enabling more files to be stored. The multimedia equivalent is MPEG-4 Part 14 (MP4), also a lossy format. Lossless formats like Free Lossless Audio Codec (FLAC) and Apple Lossless have better sound quality but require more storage space.
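
As a rough sketch of the process (the tone, sampling rates, and bit depths below are arbitrary illustrative values), digitization takes periodic samples of a continuous signal and stores each one using a fixed number of bits.

# Rough sketch of digitization: sample a continuous signal, then quantize it.
# The tone, sampling rates, and bit depths are arbitrary illustrative values.
import math

def digitize(frequency_hz, duration_s, sample_rate_hz, bit_depth):
    levels = 2 ** bit_depth                     # values each sample can take
    samples = []
    for n in range(int(duration_s * sample_rate_hz)):
        t = n / sample_rate_hz                  # time of this sample
        amplitude = math.sin(2 * math.pi * frequency_hz * t)   # value in [-1, 1]
        quantized = round((amplitude + 1) / 2 * (levels - 1))  # value in [0, levels - 1]
        samples.append(quantized)
    return samples

# A coarse capture versus a finer one of the same 440 Hz tone.
print(digitize(440, 0.001, 8000, 4))
print(digitize(440, 0.001, 44100, 16)[:8])

Raising the sampling rate or the bit depth captures the tone more faithfully, at the cost of more data to store.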

These formats and others have resulted in a proliferation of devices. Apple's iPods and other MP3 digital audio players were popular in the early twenty-first century. They provided portable sound in a relatively small container and, in some cases, played videos. Apple produced iPods from 2001 to 2022, selling over 450 million units. Though both types of media player enjoyed nearly two decades of exceptional popularity, they were eventually replaced by smartphones.

Digital cameras made traditional film cameras functionally obsolete in a relatively short time. Camera users have the choice of better-quality (higher-resolution) photographs that require more memory space or lower-resolution photos, which produce smaller files. Digital cameras in smartphones made stand-alone digital cameras less common. Similarly, DVDs made videotapes obsolete. Blu-ray, developed by Sony, allowed for higher-quality images and better sound by using a blue-violet laser rather than the standard red laser. Blue-violet light has a higher frequency than red light, which means its wavelength is shorter. The shorter wavelength lets the laser be focused on a smaller spot than a red laser can reach. Thus, the 0s and 1s on a Blu-ray disc can be smaller than those on a standard DVD, so more data can be stored on a disc of the same size. Because the recordings are encoded in binary, they can be copied and manipulated quickly and exactly, without any distortion unless distortion is deliberately introduced, as with lossy file formats. Likewise, DVD and Blu-ray players began being replaced by streaming services that shifted the way consumers buy and consume media content.
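
A back-of-the-envelope check of that relationship uses the fact that the speed of light equals frequency times wavelength; the figures of roughly 650 nm for a DVD's red laser and 405 nm for Blu-ray's blue-violet laser are typical published values, used here only for illustration.

# Back-of-the-envelope check: shorter wavelength means higher frequency
# (speed of light = frequency * wavelength). 650 nm (red, DVD) and 405 nm
# (blue-violet, Blu-ray) are typical published figures.
SPEED_OF_LIGHT = 3.0e8   # metres per second (approximate)

for name, wavelength_nm in [("DVD red laser", 650), ("Blu-ray blue-violet laser", 405)]:
    frequency = SPEED_OF_LIGHT / (wavelength_nm * 1e-9)
    print(f"{name}: {wavelength_nm} nm is about {frequency / 1e12:.0f} THz")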

Security. The explosion of digital communication and products has caused some individuals to illegally, or at least unethically, exploit people's dependence on these technologies. These people write malware, programs designed to harm or compromise devices, and send spam (digital junk mail). Malware includes viruses (programs that embed themselves in other programs, make copies of themselves, and spread), worms (programs that act like viruses but do not have to be embedded in another program), Trojan horses (malicious programs, often unknowingly downloaded with something else), ransomware (programs that encrypt a network's files and demand a cryptocurrency payment to decrypt them), and key loggers (programs that record keystrokes and mouse movements, which might include usernames and passwords). Key loggers are a type of spyware: programs that collect information about the device they are on, and about those using it, without their knowledge. Spyware is often associated with internet programs that track people's browsing habits.

Spam can include phishing schemes, in which a person sends an email that appears to come from a well-known organization such as a bank. The email typically states that there is a problem with the recipient's account and to rectify it, the recipient must verify their identity by providing sensitive information like their user name, password, and account number. The information, if given, is used to commit fraud using the recipient's identity.

Another type of spam is an advance-fee fraud, in which the sender of an email asks the recipient to help him or her claim a substantial sum of money that is due to the sender. The catch is that the recipient must first send money to “show good faith.” The money that the recipient is to get in return never arrives. Another asks the recipient to cash a check and send the scammer part of the money. By the time the check is determined to be worthless, the recipient has sent the scammer a considerable sum of money. These scams are a form of social engineering, in which the scammer first obtains the trust of the email recipient so that he or she will help the scammer.

These problems are countered by companies such as Symantec and McAfee, which produce security suites that contain programs to manage these threats. These companies keep databases of threats and require that their programs be regularly updated to combat the latest threats. These suites also include a firewall, which is designed to keep malware and spam from reaching the protected network.

Careers and Course Work

Computer science degrees require mathematics courses, including calculus, differential equations, and discrete mathematics. Physics, chemistry, and standard liberal arts courses are also required. Lower-division computer science courses are usually heavily weighted toward programming. These might include basic programming courses in C++ or Java, data structures, and assembly language. Other courses usually include computer architecture, networks, operating systems, and software engineering. Several specialty areas, such as computer engineering, affect the exact mix of coursework.

Careers in software engineering typically require a bachelor's degree in computer science. The US Bureau of Labor Statistics predicted much higher-than-average growth in the need for software developers throughout the early and mid-twenty-first century as the demand for computer software continues to increase. Computer skills are not all that employers want, however. They also want employees to understand how technology fits in with their organization and its mission.

Those who wish to conduct computer science research will generally require a doctorate in computer science or one of its branches. Computer scientists are often employed by universities, government agencies, and private companies such as IBM. Bell Labs, long the research arm of AT&T, has a tradition of Nobel Prize-winning research and was where the UNIX operating system and the C and C++ programming languages were developed.

Social Context and Future Prospects

Computers are an essential part of modern life, and their influence continues to revolutionize society, as was evident during the coronavirus pandemic, when much of the global population came to rely on computers for work and education. The advantages and disadvantages of an always-wired society are being keenly debated. People are connected and can be located by computer devices with great accuracy. As people do more online, they increasingly create an online record of their activities. Many employers routinely search online for information on job candidates and may uncover photos or comments that are many years old. As users become aware of these pitfalls, they are taking actions such as deleting unprofessional items and changing settings on social network sites such as Facebook to limit access.

These issues raise privacy concerns, including what an individual's privacy rights are on the internet and who owns the data being produced. For example, in 1976, the US Supreme Court ruled that financial records are the property of the financial institution, not the customer. This suggests that a company that collects information about an individual's online browsing habits—not the individual whose habits are being recorded—owns that information. Most of these companies state that individuals are not associated with the data they sell, and all data are used in aggregate. Still, the capacity to link data with specific individuals and sell the information exists. Other questions include whether an employer has a right to know whether an employee visits a questionable website. These concerns lead to new and ever-evolving laws and regulations concerning internet use and data privacy.

Bibliography

Andrews, Bill. "The Year in Computer Science." Quanta, 20 Dec. 2023, www.quantamagazine.org/the-biggest-discoveries-in-computer-science-in-2023-20231220. Accessed 11 June 2024.

Brooks, Frederick P., Jr. The Mythical Man-Month: Essays on Software Engineering. Anniversary ed., Addison-Wesley, 2008.

Dale, Nell, and John Lewis. Computer Science Illuminated. 8th ed., Jones and Bartlett, 2023.

Gaddis, Tony. Starting Out with C++: From Control Structures through Objects. 10th ed., Pearson, 2021.

Henderson, Harry. Encyclopedia of Computer Science and Technology. 4th ed., Facts On File, 2021.

Kidder, Tracy. The Soul of a New Machine. 1981. Back Bay Books, 2000.

Li, Cathy, and Farah Lalani. "The COVID-19 Pandemic Has Changed Education Forever. This Is How." World Economic Forum, 29 Apr. 2020, www.weforum.org/agenda/2020/04/coronavirus-education-global-covid19-online-digital-learning. Accessed 25 Mar. 2021.

Schneider, G. Michael, and Judith L. Gersting. Invitation to Computer Science. 8th ed., Course Technology, 2022.

"Software Developers." US Bureau of Labor Statistics, 17 Apr. 2024, www.bls.gov/ooh/computer-and-information-technology/software-developers.htm. Accessed 5 June 2024.

Sowells, Julia. "8 Different Types of Malware." United States Cybersecurity Magazine, www.uscybersecurity.net/malware. Accessed 25 Mar. 2021.