Test Preparation and Resources for Applied Computer Science

Computer Science is the systematic study of the structure, mechanics, expression, and feasibility of the algorithms that underlie receiving, storing, processing, communicating, and accessing information. This information may be stored within computer programs. Computer scientists explore how computational processes can be automated and implemented to access valuable data from various sources.
Computer Science was established as an academic discipline in the 1950s and 1960s. The first computer science degree program began at the University of Cambridge Computer Laboratory in England in 1953; the first in the United States did not begin until 1962, at Purdue University. Computer science broadly encompasses a vast number of fields that use and compare algorithms to find effective solutions to specific problems. Theoretical computer science has been described as the supporting base of the computing field; it concerns the mathematical foundations of designing, using, and maintaining computer software systems and of understanding information in digital form. Applied computer science uses this knowledge, together with an understanding of complex computing techniques, to solve problems in other disciplines. Beyond using computing to pursue answers to complex questions, applied computer science combines knowledge of computing with other scientific disciplines to create programs and solutions to the questions often uncovered in the course of research.
Artificial Intelligence

As early as the 1940s, Alan Turing questioned whether computers could 'think'. According to the Association for the Advancement of Artificial Intelligence, artificial intelligence is the embodiment of the scientific understanding of human intelligence and its underlying thought processes within a machine (computer). The branch of applied computer science concerned with artificial intelligence works to synthesize the processes of decision making, problem solving, adapting to the environment, and communication that humans possess, and to replicate them within a computerized system. Artificial intelligence research takes a cross-disciplinary approach, drawing on fields such as symbolic logic, applied mathematics, semiotics, electrical engineering, neurophysiology, philosophy, and social intelligence. Popular science fiction portrays artificial intelligence as robots capable of taking over civilization; the more practical application is in software development for understanding finances, economics, and the physical sciences.
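As a concrete illustration of the automated problem solving described above, here is a minimal breadth-first search planner in Python. The toy "reach 10 from 1" puzzle and all names here are illustrative assumptions, not drawn from the source; the point is only that a machine can search for an action sequence that a person would otherwise reason out.

```python
from collections import deque

def bfs_plan(start, goal, neighbors):
    """Breadth-first search: a basic automated problem-solving routine
    that returns a shortest sequence of actions from start to goal."""
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        state, path = frontier.popleft()
        if state == goal:
            return path
        for action, nxt in neighbors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, path + [action]))
    return None  # goal unreachable

# Toy problem: reach 10 from 1 using only "double" or "add one" moves.
moves = lambda n: [("double", n * 2), ("add1", n + 1)]
print(bfs_plan(1, 10, moves))  # → ['double', 'double', 'add1', 'double']
```

Because breadth-first search explores states in order of distance from the start, the first plan it finds is guaranteed to be among the shortest.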
- What is Artificial Intelligence? – This is an explanation of artificial intelligence from a historical perspective.
- Broad Discussions of Artificial Intelligence – The Association for the Advancement of Artificial Intelligence has compiled a collection of discussions concerning artificial intelligence on its site. Also included is a list of recommended reading for more information.
- The Turing Test – In this paper you will find a discussion of the Turing Test as a means to determine if a computer can think in the way humans use cognition. Included are several examples of the test and a discussion on the variations and standards for passing the test.
- What does the Turing Test Really Mean? – The author of this essay on the meaning behind the Turing Test discusses how the question of artificial intelligence 'passing' as human intelligence may have been influenced by his own experience of fitting in, or 'passing' as 'normal', among his peers.
- Artificial Intelligence: Realizing the Ultimate Promises of Computing – The quest for artificial intelligence has led to applications ranging from industrial robotics and pathology diagnosis systems that incorporate added levels of knowledge and reasoning to a variety of SMART technology innovations. This is a discussion of potential future uses of advances in artificial intelligence research.
Computer Architecture and Engineering
Computer architecture and engineering involves designing hardware and organizing the system or devices to meet specific goals, functions, and capabilities using the latest technologies. The field tracks technology trends such as power in integrated circuits, the scaling of transistors and wires, bandwidth over latency, dependability, and cost to design a system that meets specific parameters. Computer architects and engineers work on the challenge of finding power-efficient ways to improve scaling performance. This may involve circuit technologies, processor designs, execution strategies, storage system organization, micro-architectures, and unique programming abstractions.
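One standard way to reason about the scaling-performance challenge mentioned above is Amdahl's law, which is not cited in the source itself but is a staple of computer architecture: the serial fraction of a workload caps the speedup that adding processors can deliver, no matter how many are added. A minimal sketch:

```python
def amdahl_speedup(parallel_fraction, n_processors):
    """Overall speedup when a fraction of the work runs in parallel
    across n processors; the serial remainder limits the gain."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_processors)

# Even with 1000 processors, a 5% serial portion caps speedup near 20x.
print(round(amdahl_speedup(0.95, 1000), 1))  # → 19.6
```

This is why architects focus on shrinking the serial portion (and its power cost) rather than simply adding more hardware.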
- What is Computer Architecture? – In this overview of computer architecture, the technology and science are examined and compared to building architecture before going on to discuss the influence of technology.
- Computer Architecture and Systems - Electronic and computer architecture research continues to bring advances to this area of Applied Computer Science. This page offers examples of Computer Architecture Research conducted at NC State University.
- Computer Architecture Summary – This is a PowerPoint presentation summarizing the key components of computer architecture, including fundamentals and limitations.
- Computer Architecture: The Language of the machine – A review of the components that go into Computer Architecture including the various instruction sets, memory types, and instruction set design is found in this presentation.
- Computer Architecture and Engineering Curriculum Model – This is an overview of the curriculum guidelines for computer architecture established by the Joint Task Force on Computing Curricula 2001.
Computer Graphics and Visualization
A large share of the human brain is devoted to processing visual information, which equips us to depend on vision to understand much of the information we handle in our daily interactions. The most effective way to communicate computer-generated information is to exploit the high-bandwidth channel that is an innate part of our complex visual system. Graphics has come a long way from the earliest stick sketches to modern computer visualization. While both graphics and visualization involve transforming data into images, graphics concerns rendering information into pixels, shading, and so forth, while visualization concerns mapping data onto visual constructs that can then be rendered into graphics for various displays. The line between the two is often blurred, and the terms are sometimes used interchangeably.
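The mapping-then-rendering distinction can be sketched in a few lines of Python. The linear grayscale colormap below is an illustrative assumption, not taken from the source; it stands in for the mapping step that visualization performs before graphics takes over.

```python
def to_grayscale(values):
    """Map arbitrary numeric data onto 0-255 gray levels (the
    'visualization' step); drawing those levels as pixels on a
    display would be the 'graphics' step."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero for constant data
    return [round(255 * (v - lo) / span) for v in values]

print(to_grayscale([2.0, 3.5, 5.0]))  # → [0, 128, 255]
```

Real visualization systems use richer colormaps and 2D/3D layouts, but the pipeline is the same: data is first mapped to visual attributes, then rendered.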
- Parallelism in Graphics and Visualization – This PowerPoint slideshow examines parallelism in graphics and visualization.
- Attention & Visual Memory – Research explored in this article on visual perception and attention emphasizes how visual attention and memory relate to the use of graphics.
- Algorithms for Computer Graphics and Visualization – Here you will find a discussion of the issues, unresolved problems, and ongoing research needed for computer graphics algorithms, especially related to 3D and 4D technology.
- Research Issues in Visualization – This report examines existing visualization techniques and looks at possible future vector field visualizations.
- Visualization & Graphics: Volumetric Displays – This research explores the use of a new technique that broadens the use of volumetric displays to allow insight into 3D objects.
Computer Security and Cryptography
Cryptography is the branch of cryptology concerned with designing systems for encoding and decoding data to ensure computer security. It relies on a combination of ideas from number theory and theoretical computer science to design computer security systems. Cryptography makes encoded messages less likely to be read by unauthorized users, and it can be used to produce ciphertext ranging from weak to extremely strong. Strength is determined by how long it would take someone to decode the encrypted messages.
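To make the weak-versus-strong distinction concrete, here is a deliberately weak cipher sketched in Python. The cipher, message, and names are illustrative assumptions, not from the source: a one-byte key gives only 256 possibilities, so decoding by brute force takes almost no time, which is exactly what makes the ciphertext weak.

```python
def xor_cipher(data: bytes, key: int) -> bytes:
    """Toy single-byte XOR cipher. Deliberately weak: real systems use
    vetted algorithms (e.g., AES) with vastly larger key spaces."""
    return bytes(b ^ key for b in data)

message = b"attack at dawn"
ciphertext = xor_cipher(message, 0x5A)  # encode with a 1-byte key

# An attacker who guesses that the word "attack" appears can try all
# 256 possible keys almost instantly; the tiny key space is the flaw.
recovered = next(p for p in (xor_cipher(ciphertext, k) for k in range(256))
                 if b"attack" in p)
print(recovered)  # → b'attack at dawn'
```

A strong cipher forces the same brute-force strategy through a key space so large (e.g., 2^128 keys) that the search becomes computationally infeasible.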
- What is Cryptology? – A descriptive explanation of cryptology and a brief look at the history behind its use in securing information.
- The Basics of Cryptography – This is a good basic overview of encryption and decryption. There is also a discussion of the strength of various encryption codes.
- Coding Theory & Cryptography – Here is a discussion of coding theory as it relates to cryptology and the challenges of privacy, security, and confidentiality of information over insecure networks or channels.
- Cryptography & Computer Security – This paper presents an overview of predominant encryption algorithms.
- Analysis of Data Encryption Algorithms – A comparison of the performance between encryption algorithms commonly used in the C# language.
Computational Science

Computational science uses advanced computing technology and systems to examine, understand, and solve a variety of complex problems, and it is quickly becoming one of the most important fields in the advancement of modern society. Using computational science, researchers are able to investigate the processes of the human brain, analyze the spread of infectious diseases, support advanced industrial technology, and more.
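As a small taste of the disease-spread analysis mentioned above, the classic SIR compartment model can be stepped forward with simple Euler integration. The model choice and the parameter values below are illustrative assumptions, not taken from the source.

```python
def sir_step(s, i, r, beta=0.3, gamma=0.1, dt=1.0):
    """One Euler step of the SIR epidemic model: susceptibles (s)
    become infected at rate beta*s*i, and infected (i) recover at
    rate gamma*i, moving into the recovered compartment (r)."""
    new_infections = beta * s * i * dt
    recoveries = gamma * i * dt
    return (s - new_infections,
            i + new_infections - recoveries,
            r + recoveries)

# Start with 1% of the population infected and simulate 100 days.
s, i, r = 0.99, 0.01, 0.0
for _ in range(100):
    s, i, r = sir_step(s, i, r)
print(round(s + i + r, 6))  # population fraction is conserved → 1.0
```

Even this toy model shows the core workflow of computational science: encode a real-world process as equations, then let the computer iterate them to explore outcomes that would be tedious or impossible to derive by hand.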
- What is Computational Science? – This page offers a description of Computational Science and its use in designing realistic models to explore real world systems.
- Computational Science and America’s Competitiveness – In this report, the importance of computational science to America’s competitiveness in global technology is discussed.
- Computational Science Sparks Innovation – Here is a look at several innovations sparked by the computational science work of George Schatz.
- Master of Science in Computation in the Sciences – This page explains the core requirements for the Master of Science in Computation in the Sciences at Valparaiso University.
- A Survey of the Practice of Computational Science – In an effort to understand the practice of computational science from both scientific and engineering perspectives this paper surveys scientists from a variety of practices who use computational programming as part of their research.
Information Science

Information science is inherently connected to information technology. As information continues to evolve and expand, the definition of information science also changes. A perhaps over-simplified explanation of information science is that it is the study of the collection, organization, storage, retrieval, and dissemination of information. Information science involves components of both pure science and applied science. On a deeper level, it also involves aspects of multiple disciplines such as education, journalism, and communication research.
- What is Information Science? – This is a collection of papers that help answer the question of what exactly information science is and how it is used in today’s society.
- Information Science – The author of this essay examines information science as a professional practice and field of scientific study.
- Invisible Substrates of Information Science – In this report, the author explores aspects of information science as a substrate within other disciplines.
- What Kind of Science Can Information Science Be? – This paper examines what is involved in, or included within, the field of information science. The author examines the confusion surrounding the use of ‘information science’ to refer to various disciplines and fields, and calls for clarification and for understanding information science within context.
- Information Science’s Contribution – A general overview of some important contributions information science has made worldwide.
Software Engineering

Software engineering uses a disciplined and organized approach to the development and delivery of computer software. It involves all aspects of software development and production, depending upon advanced technology to solve problems. In contrast to computer science, which focuses on theories and fundamentals, software engineering focuses on the development and delivery of practical software solutions. Software engineering is the part of systems engineering that provides applications, controls, and databases within the hardware system.
- Introduction: Software Engineering – This PowerPoint presentation introduces software engineering, including a look at ethical and professional issues.
- What Does Software Engineering Involve? – This paper provides a brief explanation of what is involved within the field of software engineering.
- Career Focus: Software Engineering – According to this report, software engineering is expected to continue growing in demand in a number of fields such as environmental engineering, economics, electric vehicle mechanics and more.
- A Career in Software Engineering – This brochure published by Griffith University summarizes career opportunities in software engineering.
- Software Engineering Process – The purpose of this presentation is to introduce core concepts of software engineering and their place within a team approach to product development.