How a computer works
A bit is physically represented by a voltage pulse that changes the state of a transistor: '0', or 'off', is commonly represented by a voltage below 0.4 V, and '1', or 'on', by a voltage of 2.0 to 2.4 V. The two distinct voltage bands allow robust, virtually error-free representation and processing of data (text, graphics, sound, and video). All data and instructions are transferred to the central processing unit as streaming patterns of electrical pulses. Today's computers are still based on the ideas of two 'fathers of computer science': in the 1930s, Turing described a highly theoretical model of a universal computer; roughly ten years later, von Neumann described the concept ('architecture') of a workable computer that stores data and instructions in memory. In modern computers, executable instructions, as well as data, are coded in machine language. Processor and memory exchange myriads of electrical pulses at a rate of billions of clock ticks per second, switching microscopic transistors and charging and discharging microscopic capacitors, with meaning conferred by the programmer's code.
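To make the idea of bit patterns concrete, here is a minimal C sketch (the example values are illustrative) that prints the bits of a byte; the character 'A' and the number 65 turn out to be the very same pattern:

```c
#include <stdio.h>

/* Print the 8 bits of one byte, most significant bit first. */
static void print_bits(unsigned char byte) {
    for (int i = 7; i >= 0; i--)
        putchar((byte >> i) & 1 ? '1' : '0');
    putchar('\n');
}

int main(void) {
    print_bits('A');  /* the character 'A' -> 01000001 */
    print_bits(65);   /* the number 65    -> 01000001 (same pattern) */
    return 0;
}
```

Whether a pattern means a letter, a number, or an instruction is decided entirely by how the program uses it.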
The following concepts, procedures, and hardware components are part of a multitude of ingenious ideas and electronic gadgetry that make a computer work:
- Instruction set. Fundamental, processor-specific instructions in machine code are recognized and executed by the CPU. The instruction set specifies the most basic operations (such as reading, writing, moving, and comparing individual bytes) and describes the structures and conventions (e.g., memory layout, registers, interrupts, supported data types, addressing modes) that compilers and assemblers must use. A sketch of a toy instruction encoding follows this list.
- Processor. The control unit (CU) and the arithmetic logic unit (ALU) direct process flow and execute computations. The CU fetches and decodes instructions and directs the flow of data for processing and storage. The ALU carries out arithmetic on binary numbers and the logical operations called for by conditional expressions. Billions of electrical signals (bits and bytes) are processed per second. The fetch-decode-execute cycle is sketched after this list.
- Random access memory. RAM provides temporary storage and fast access for data and instructions. Processor and memory are connected by a fast bus, allowing rapid exchange of addresses, data, and commands. For especially fast access to frequently used data, a cache is built into the processor. Larger amounts of less frequently used data can be buffered through paging, which creates virtual memory on the hard disk. A sketch after this list makes the cache effect measurable.
- Registers. Built-in processor registers further accelerate processing. Registers are small (word-sized) memory cells directly connected to the arithmetic logic unit. Even faster than the cache, they temporarily hold, among other things, memory addresses, help keep data and instructions apart, and keep track of program execution (the toy processor sketch below models a small register file).
- Stacks. Software-generated stacks support efficient processing of subroutines. Stacks work according to the last-in, first-out (LIFO) rule and are used (mostly automatically) in translating high-level code and for storing return addresses. Special registers (stack pointers) keep track of the current top of the stack. A minimal LIFO sketch follows this list.
- Interrupts. Software- or hardware-induced signals stop and redirect the flow of a program. Interrupts can be raised externally (e.g., by keyboard or mouse) or internally (e.g., deliberately programmed or triggered by a program error). A sketch of the software analogue, a signal handler, follows this list.
- Pipelining. This method splits each instruction into sequential fundamental steps (such as fetch, decode, execute, access memory, and write back) and then overlaps these steps across several instructions, rather than executing the full step sequence of each instruction one after another. A timing sketch follows this list.
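To illustrate the instruction-set item, here is a minimal C sketch of a made-up 16-bit instruction format; the opcodes, field widths, and mnemonics are invented for illustration and belong to no real processor. The point is only that an instruction is a fixed layout of bit fields packed into one machine word:

```c
#include <stdio.h>
#include <stdint.h>

/* Invented 16-bit instruction format:
   bits 12-15: opcode, bits 8-11: destination register,
   bits 4-7: source register, bits 0-3: unused. */
enum { OP_MOVE = 0x1, OP_ADD = 0x2, OP_CMP = 0x3 };

static uint16_t encode(uint16_t op, uint16_t dst, uint16_t src) {
    return (uint16_t)((op << 12) | (dst << 8) | (src << 4));
}

int main(void) {
    uint16_t insn = encode(OP_ADD, 1, 2);   /* "add r1, r2" */
    printf("machine word: 0x%04X\n", insn); /* prints 0x2120 */
    printf("opcode: %X  dst: r%X  src: r%X\n",
           insn >> 12, (insn >> 8) & 0xF, (insn >> 4) & 0xF);
    return 0;
}
```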
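The fetch-decode-execute cycle of the control unit, together with a small register file, can be sketched as a toy interpreter in C; the instruction format and opcodes are again invented for illustration. Real CPUs perform the same three steps in hardware, billions of times per second:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* Invented 16-bit format: opcode | register | 8-bit operand. */
    enum { HALT = 0x0, LOADI = 0x1, ADD = 0x2 };
    uint16_t program[] = {
        0x1105,  /* LOADI r1, 5              */
        0x1203,  /* LOADI r2, 3              */
        0x2102,  /* ADD   r1, r2 -> r1 = 8   */
        0x0000,  /* HALT                     */
    };
    uint16_t reg[4] = {0};  /* the register file */
    int pc = 0;             /* program counter   */

    for (;;) {
        uint16_t insn = program[pc++];        /* fetch  */
        int op = insn >> 12;                  /* decode */
        int r  = (insn >> 8) & 0xF;
        int x  = insn & 0xFF;
        if (op == HALT) break;                /* execute */
        else if (op == LOADI) reg[r] = (uint16_t)x;
        else if (op == ADD)   reg[r] = (uint16_t)(reg[r] + reg[x]);
    }
    printf("r1 = %d\n", reg[1]);  /* prints r1 = 8 */
    return 0;
}
```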
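The benefit of the processor cache can be made visible with a simple C experiment: summing a large matrix along sequential addresses (cache-friendly) versus in large strides (cache-hostile). Absolute timings depend on the machine and on compiler optimization settings, and the matrix size N is chosen arbitrarily:

```c
#include <stdio.h>
#include <time.h>

#define N 4096

int main(void) {
    static int m[N][N];  /* zero-initialized, about 64 MB */
    long sum = 0;
    clock_t t;

    /* Row by row: consecutive addresses, so cache lines are reused. */
    t = clock();
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++) sum += m[i][j];
    printf("row-major:    %.3f s\n", (double)(clock() - t) / CLOCKS_PER_SEC);

    /* Column by column: large strides, so cache lines are wasted. */
    t = clock();
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++) sum += m[i][j];
    printf("column-major: %.3f s\n", (double)(clock() - t) / CLOCKS_PER_SEC);

    return (int)(sum & 1);  /* use sum so the loops are not optimized away */
}
```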
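The LIFO rule is easy to show with a minimal array-based stack in C; the pushed values stand in for return addresses of nested subroutines and are invented for illustration:

```c
#include <stdio.h>

/* A minimal LIFO stack: the value pushed last is popped first,
   so nested subroutine calls unwind in reverse order of entry. */
enum { STACK_SIZE = 16 };
static int stack[STACK_SIZE];
static int sp = 0;                      /* the "stack pointer" */

static void push(int v) { stack[sp++] = v; }
static int  pop(void)   { return stack[--sp]; }

int main(void) {
    push(101);  /* e.g., return address of an outer subroutine    */
    push(202);  /* ... of a nested subroutine                     */
    push(303);  /* ... of the innermost subroutine                */
    while (sp > 0)
        printf("returning to %d\n", pop());  /* 303, then 202, then 101 */
    return 0;
}
```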
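At the level of a running program, a portable software analogue of an interrupt is a signal: the normal flow is suspended, a handler runs, and execution resumes where it left off. A minimal C sketch, raising the signal on itself so the example terminates on its own:

```c
#include <signal.h>
#include <stdio.h>

static volatile sig_atomic_t interrupted = 0;

/* Handlers should do minimal work, e.g., set a flag. */
static void handler(int sig) {
    (void)sig;
    interrupted = 1;
}

int main(void) {
    signal(SIGINT, handler);  /* install the handler */
    printf("before the interrupt\n");
    raise(SIGINT);            /* trigger it programmatically */
    if (interrupted)
        printf("flow was redirected to the handler and back\n");
    return 0;
}
```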
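The overlap that pipelining achieves can be visualized with a short C program that prints which stage each instruction occupies at each clock tick; with the five stages named above, four instructions finish after eight ticks instead of the twenty a strictly sequential processor would need:

```c
#include <stdio.h>

/* Print a pipeline timing diagram: instruction i enters the
   pipeline at tick i and advances one stage per tick. */
int main(void) {
    const char *stage[] = {"IF", "ID", "EX", "MEM", "WB"};
    enum { STAGES = 5, INSNS = 4 };

    printf("tick   I1   I2   I3   I4\n");
    for (int t = 0; t < STAGES + INSNS - 1; t++) {
        printf("%3d ", t + 1);
        for (int i = 0; i < INSNS; i++) {
            int s = t - i;
            if (s >= 0 && s < STAGES) printf("%5s", stage[s]);
            else                      printf("%5s", "-");
        }
        printf("\n");
    }
    return 0;
}
```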