August 2, 2010

Computer

A computer is a machine with a processor unit that has the ability to execute and store programs. A computer is an electronic circuit that manipulates data in binary form, or bits. This machine can automatically process data or information according to predefined sequences of instructions, also called programs.

A computer interacts with its environment through devices such as the monitor, keyboard, mouse, printer, modem, and CD drive. Computers can be classified according to several criteria (scope, size, or architecture).

History of Computing

In 1936, the publication of the seminal paper of computer science, On Computable Numbers, with an Application to the Entscheidungsproblem, by Alan Turing kicked off the creation of the programmable computer. In it he presents the Turing machine, the first universal programmable computer, and invents the concepts of programming and program.
In this 35-page paper, Turing designed an imaginary machine that can perform any calculation from binary 0s and 1s. He argued that such a machine could carry out an unlimited range of calculations, and suggested that it could actually be built and programmed to work with 0s and 1s.

The results it produces are impossible to distinguish from those of a human brain. This was the birth of artificial intelligence, because the calculation is the same regardless of who performs it, machine or human. Calculation is a function understood by both man and machine. When an operation is simulated by a machine, the operation itself is similar; what changes are the mechanisms of "computation".

At a seminar on the theme of brain mechanisms in behavior, von Neumann opposed Skinner. He was among the first to use the word "cybernetics" (from "steering": the brain governs; it refers to the science of any system capable of self-regulation and communication, like humans). Skinner wanted to restore behaviorism; von Neumann wanted to design a program recorded and stored in the machine: the birth of the first computer. In 1951, the CNRS organized in Paris the Conference on Calculating Machines: the birth of cognitive psychology conceived as computation.
Shortly before the Second World War, the first electromechanical calculators appeared, built according to the ideas of Alan Turing. These machines were quickly supplanted by the first electronic computers, which were much more efficient.

The first computer operating in binary language was Colossus, developed during the Second World War; it was not, however, a complete general-purpose machine. Turing had worked on the project. At the end of the war, it was dismantled and kept hidden due to its strategic importance. ENIAC, commissioned in 1946, was the first fully electronic computer built on Turing's theory.

The word ordinateur (the French word for computer) was introduced by IBM France in 1955. François Girard, then head of the company's advertising department, had the idea of consulting his former literature professor in Paris, Jacques Perret, to ask him to suggest a word that best characterized what is commonly called a computer.
The first multitasking computer was the Bull Gamma 60, in 1958. The first portable computer was the Sinclair ZX-81; it was sold as a kit and had a memory of 1 KB (one kilobyte, 1,024 bytes), and an extension later enabled it to reach 16 KB. Then came family-oriented computers like the Apple II in 1977, the Hector in 1981, the Commodore 64 in 1982, the Oric-1 in 1983, the Apple IIe, and then the Amstrad CPC 464, 664, and 6128...

Simple Computing

Computers were first used for calculation (on integers at first, then on floating-point numbers). They cannot, however, be reduced to simple calculators: the result of a computer's processing can be not only a series of numbers but also a new program (used by that computer or by another). In the von Neumann architecture, data takes a uniform form and can be interpreted as numbers, instructions, logical values, or arbitrarily defined symbols (a letter of the alphabet, for example).

Calculation is only one possible application; in this case the data are treated as numbers.
The computer is also used for its ability to organize information, on magnetic storage devices among others. This ability to organize information has generalized the use of word processing by the general public. The management of relational databases likewise makes it possible to retrieve and consolidate distributed information, seen by the user as several independent tables.

The creation of this neologism led to multiple translations of expressions such as supercomputer or quantum computer. In the latter case, the use of the word "computer" is somewhat overstated, because the possibilities of quantum computing are far from the versatility of a "computer".

Experience has taught us to distinguish two aspects of a computer, the second of which was initially underestimated: the physical architecture (hardware) and the software architecture (software). A computer that was technically very advanced for its time, like Bull's Gamma 60, did not meet with the expected success, for the simple reason that there were few means of conveniently exploiting its technical possibilities. Software and its complement, services (training, maintenance, ...), have since the mid-1980s formed the bulk of the cost of computer equipment, with hardware now a minority share.
Computers may also be vulnerable to electromagnetic pulse (EMP) bombs.

How a Computer Works

Of all the machines invented by humans, the computer is the one that comes closest to the following anthropological concept:

Input organ. Information-processing organ. Output organ.
In humans, the input organs are the five senses; the processing organ is the brain, whose software is learning, with constant updates throughout its lifetime; and the output organs are the muscles. For modern computers, the input organs are the keyboard and the mouse, and the output devices are the screen, the printer, the DVD burner, etc.

Advanced Computer

The techniques used to manufacture these machines have changed enormously since the 1940s and have become a full-fledged technology (that is to say, an organized set of industrial techniques) since the 1970s. Many machines still use the concepts defined by John von Neumann, although this architecture is in decline: programs no longer modify themselves (which would be considered bad programming practice), and the hardware takes this new state of affairs into account by now clearly separating the storage of instructions and that of data, notably in the caches.
The von Neumann architecture breaks the computer down into four distinct parts:
the arithmetic logic unit (ALU) or processing unit: its role is to perform basic operations, much like a calculator;
the control unit: the equivalent of the fingers actuating the calculator;
the memory, which contains both the data and the program that tells the control unit which calculations to perform on that data; memory is divided into RAM (programs and data during operation) and read-only memory (the machine's basic programs and data);
the input/output devices, which allow communication with the outside world.
ALU and Control Unit

The arithmetic logic unit, or ALU, is the element that performs basic operations (addition, subtraction, ...), logical operations (AND, OR, NOR, etc.), and comparison operations (for example, comparing two areas of memory for equality). It is the ALU that performs the computer's calculations.

The control unit takes its instructions from memory. They tell it what it should order the ALU to do
and how it may have to act according to the results the ALU provides. Once an instruction is finished, the control unit moves on to the following instruction, or to another instruction to which the program directs it to jump.

The control unit facilitates communication between the arithmetic logic unit, the memory, and the peripherals. It oversees most of the execution of instructions in the computer.
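The interplay described above — the control unit fetching instructions from a memory shared with data, ordering the ALU to compute, and deciding where to go next — can be sketched as a tiny fetch-decode-execute loop. The instruction set (LOAD, ADD, STORE, HALT) and the two-field instruction format are invented for illustration, not any real machine's:

```python
# A minimal sketch of the fetch-decode-execute cycle.
# The instruction set is hypothetical, chosen only to illustrate the idea.

def run(memory):
    """memory holds both instructions and data (von Neumann model)."""
    acc = 0          # accumulator register, written by the ALU
    pc = 0           # program counter, managed by the control unit
    while True:
        op, arg = memory[pc]        # fetch the next instruction
        pc += 1
        if op == "LOAD":            # copy a memory cell into the accumulator
            acc = memory[arg]
        elif op == "ADD":           # the ALU adds a cell to the accumulator
            acc += memory[arg]
        elif op == "STORE":         # write the accumulator back to memory
            memory[arg] = acc
        elif op == "HALT":          # stop and hand back the final memory state
            return memory

# Cells 0-3 hold the program; cells 5-7 hold the data.
mem = {0: ("LOAD", 5), 1: ("ADD", 6), 2: ("STORE", 7), 3: ("HALT", 0),
       5: 2, 6: 3, 7: 0}
print(run(mem)[7])  # prints 5 (2 + 3, stored in cell 7)
```

Note that nothing in the loop distinguishes instruction cells from data cells: it is only the program counter's path through memory that gives some cells the role of instructions.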

Memory

Within the system, the memory can be described as a series of numbered cells, each containing a small amount of information. This information can be used to tell the computer what to do (instructions) or contain data to be processed. In most architectures, the same memory is used for both functions. In massively parallel computers, it is even accepted that program instructions may be replaced by others during operation when this results in greater efficiency. This practice was once common, but the readability requirements of software engineering have made it decline, except in this particular case, for several decades.
This memory can be rewritten as many times as necessary. The size of each block of memory and the technology used have varied according to cost and requirements: 8 bits for telecommunications, 12 bits for instrumentation (DCS), and 60 bits for large scientific computers (Control Data). A consensus eventually formed around the byte as the addressable unit, with instructions 4 or 8 bytes in size.

In all cases, the byte is addressable, which simplifies the writing of programs.
The technologies used to make memories have included electromechanical relays, mercury tubes in which acoustic waves were generated, individual transistors, ferrite cores, and finally integrated circuits including millions of transistors.

Input-Output

The input/output devices enable the computer to communicate with the outside world. These devices are very important, from the keyboard to the screen.

The common point among all input devices is that they convert the information they retrieve from the outside into data understandable by the computer. Conversely, output devices decode the information provided by the computer to make it understandable by the user.

Computer Bus

These parts are connected by three buses: the address bus, the data bus, and the control bus. A bus is a grouping of a certain number of electrical conductors (wires) used to carry binary information encoded on several bits.

The address bus carries the addresses generated by the CPU (Central Processing Unit) to select a memory cell or an internal register of one of the blocks. The number of bits conveyed by this bus depends on the amount of memory that must be addressed.

The data bus carries data between the various elements of the system.
The control bus carries the different timing signals needed for operating the system: read signal (RD), write signal (WR), select signal (CS: Chip Select).
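To make concrete the remark that the number of bits on the address bus depends on the amount of memory to address: each additional wire doubles the number of cells the CPU can select. A quick sketch (the widths chosen are just common examples):

```python
# How many memory cells an address bus of a given width can select.
# Each wire carries one bit, so n wires select 2**n distinct addresses.
def addressable_cells(bus_width_bits):
    return 2 ** bus_width_bits

print(addressable_cells(16))  # prints 65536 (64 KB with byte addressing)
print(addressable_cells(32))  # prints 4294967296 (4 GB)
```

This is why widening the address bus, rather than speeding it up, is what historically allowed machines to use more memory.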

Architecture

Miniaturization makes it possible to integrate the ALU and the control unit within a single integrated circuit known as a microprocessor.

Typically, the memory is located on integrated circuits near the processor; part of this memory, the cache, may be located on the same integrated circuit as the ALU.

On most architectures, the whole is completed by a clock that paces the processor. Of course, we want it to be as fast as possible, but its speed cannot be increased without limit, for two reasons:
the faster the clock, the more heat the processor generates (as the square of the frequency), and excessive temperatures may damage it; and there is a pace beyond which the processor becomes unstable and generates errors that most often lead to a crash.

The trend from 2004 onward has been to consolidate several ALUs in the same processor, or several processors in the same chip. Indeed, progressive miniaturization (see Moore's law) permits this with little change in cost. Another trend, since 2006 at ARM, is microprocessors without a clock: half of the heat dissipated is due to clock signals when the microprocessor is working, and furthermore a clockless microprocessor consumes almost nothing when it is not working; the only clock signal then necessary is the one used for refreshing the memories. This advantage is important for portable models.
The main functional difference today compared with the von Neumann model is the presence, on some architectures, of two different caches, one for instructions and one for data (whereas the von Neumann model specified a common memory for both). The reason for this divergence is that the modification of a program's instructions by the program itself is now considered (except on highly parallel machines) a practice to be avoided. Therefore, while the data cache must be written back to main memory when its contents are modified, the instruction cache never needs to be, which simplifies the circuits and improves performance.

Instructions

The instructions that the computer can understand are not those of human language. The hardware can only execute a limited number of well-defined instructions. Typical instructions understood by a computer are "copy the contents of cell 123 and place it in cell 456", "add the contents of cell 321 to the contents of cell 654 and place the result in cell 777", and "if the contents of cell 999 is 0, execute the instruction in cell 345". Most instructions consist of two fields: one indicating what to do, called the opcode, and the other indicating where to do it, called the operand.
Within the computer, instructions correspond to numeric codes; the code for a copy might be, for example, 001. The set of instructions that a computer supports is called its machine language, a language that is a sequence of binary digits, because the instructions and data understood by the processor (CPU) consist only of 0s and 1s: 0 = electric current does not pass, 1 = electric current passes.
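The split of an instruction word into an opcode field and an operand field can be shown by decoding a binary word. The field widths (a 3-bit opcode and an 8-bit operand) and the mapping 001 = copy are illustrative only, echoing the example above, not any real machine's encoding:

```python
# Decode a hypothetical 11-bit instruction: 3-bit opcode + 8-bit operand.
OPCODES = {0b001: "copy"}         # 001 = copy, as in the example (illustrative)

def decode(word):
    opcode = word >> 8            # top 3 bits say what to do
    operand = word & 0xFF         # low 8 bits say where to do it
    return OPCODES.get(opcode, "unknown"), operand

instr = 0b001_01111011            # opcode 001, operand 123
print(decode(instr))              # prints ('copy', 123)
```

Real instruction sets use many field layouts, but the principle is the same: fixed groups of bits within the word are interpreted as opcode and operands.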

In general, programmers no longer use this kind of language, but instead go through what we call a high-level language, which is then converted into binary by a special program (a compiler or an interpreter, as needed). The resulting programs are compiled programs understandable by the computer in its native language.

Some programming languages, such as assembler, are called low-level languages because the instructions they use are very close to those of the computer. Programs written in these languages are thus very dependent on the platform for which they were developed. The C language, much easier to read than assembler, allows programmers to be more productive. For this reason, it has been used more and more as hardware costs fell and programmers' hourly wages rose.

Software

Computer software is a long list of instructions executable by a computer. Many programs contain millions of instructions, some of them executed repeatedly. Nowadays, a personal computer executes billions of instructions per second.

Since the mid-1960s, computers have run multiple programs simultaneously. This is called multitasking, and it applies to all computers today.

In fact, each processor core executes only one program at a time, switching from one program to another whenever necessary. If the processor's speed is sufficiently large compared to the number of tasks to perform, the user has the impression that the programs run simultaneously. Priorities among the various programs are generally managed by the operating system.
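The switching described above can be sketched as a simple round-robin loop: the core takes one step of a program, puts it back at the end of the queue if it is not finished, and moves on. Real operating-system schedulers also handle priorities, blocking, and preemption; this is only a sketch of the time-slicing idea:

```python
from collections import deque

# Round-robin sketch: one core alternates between programs, one step at a time.
def run_round_robin(programs):
    """programs: dict mapping a name to its list of remaining steps.
    Returns the order in which the core attended to each program."""
    ready = deque(programs.items())   # queue of runnable programs
    order = []
    while ready:
        name, steps = ready.popleft()
        order.append(name)            # the core runs one step of this program
        if steps[1:]:                 # not finished: back to the end of the queue
            ready.append((name, steps[1:]))
    return order

print(run_round_robin({"A": [1, 2], "B": [1]}))  # prints ['A', 'B', 'A']
```

If the steps are short enough, the interleaved order above is indistinguishable, from the user's point of view, from A and B running at the same time.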

Operating System

The operating system is the central program that contains the basic programs needed for the proper functioning of computer applications. It allocates the computer's physical resources (CPU, memory, ...) among the various running programs. It also provides software tools (such as drivers) to facilitate the use of the different devices without having to know their physical details.
