How Does a Computer Work?
Ask someone how a computer works, and they will say things like: a computer does not understand human language; it understands only the mathematical language of the binary system, or simply on and off, true and false. Well, that's pretty much it! But what exactly do you understand by that?
I am not an expert on computers. I have simply tried to explain, in simple language, what I have understood about the basic mechanism of computers (leaving out large parts of the technical details). If you are an expert on computers, your criticism is welcome. Please help me and others understand more about this.
If you have been to a school where computer science was taught, then you may already know that 'a computer' originally meant a person trained in mathematics who performed mathematical calculations (i.e. computed) by hand, in the seventeenth and eighteenth centuries. The word kept that meaning until the middle of the twentieth century.
Why did people need computers?
Well, we all know that mathematics is not everyone's cup of tea (though it should be). Not everyone can quickly and precisely perform even a simple task of addition. And even the smartest of people need a (relatively) long time to multiply, say, two fifteen-digit numbers. To reduce calculation time and produce results with precision, a machine was needed.
Several numeral systems evolved in different parts of the world; the Hindu-Arabic numeral system is now the most common one for representing numbers. A numeral is a symbol, for instance '9', which represents a number. But a number, actually, does not exist! A number exists only in our minds. '1' human being has '2' hands with '5' fingers on each. All of these numbers represent forms: one human-like form has two hand-like forms, each of which has five finger-like forms.
More than two thousand years ago, centuries before the adoption of the numeral system we use today, humans had the abacus to calculate with numbers, i.e. to perform simple mathematical operations.
You can check the following video to see how an abacus works (for addition and subtraction).
Many devices evolved with time. The slide rule, the Pascaline, Leibniz's calculating machine, and Charles Babbage's Difference Engine are among the notable ones.
The need to compute did not limit itself to arithmetic, and human desire to expand the use of machines in computation increased to its pinnacle in desperate times of war in the twentieth century.
The Transformation from Analog to Digital
Let us take Pascaline to understand the working of an analog calculator.
A look at the Pascaline reminds me of the odometer of my motorcycle. An odometer measures the distance traveled by a vehicle.
The Pascaline performed addition and subtraction. Multiplication is repeated addition, and division is repeated subtraction. You can check out the working of a Pascaline in the following video.
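The idea that multiplication and division reduce to addition and subtraction can be sketched in a few lines of Python. These function names are my own, for illustration only; this is the principle, not how the Pascaline's gears actually worked.

```python
# A sketch of how a machine that can only add and subtract
# could still multiply and divide.

def multiply(a, b):
    """Multiply by repeated addition: a * b = a + a + ... (b times)."""
    total = 0
    for _ in range(b):
        total += a
    return total

def divide(a, b):
    """Integer-divide by repeated subtraction: count how many times
    b can be taken away from a before a runs out."""
    count = 0
    while a >= b:
        a -= b
        count += 1
    return count  # the quotient; whatever is left of a is the remainder

print(multiply(6, 7))   # 42
print(divide(45, 13))   # 3
```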
Well, the odometer of a vehicle can also perform addition. If a car is driven for, say, thirty-two kilometers, the odometer records 32 (which is also the 'memory' of the meter; it has remembered what it recorded). If the car runs a further thirteen kilometers, the odometer will record 45 (adding 13 to 32, and overwriting the 'memory').
This is pretty much how analog devices work. If you want to widen the scope of what such a device can do, you need to add more wheels, axles, and other mechanical components.
Digital devices, on the other hand, make use of just two numbers, 0 and 1. Again, the numbers exist only in our minds: 0, in reality, may represent 'off' or 'false' or 'low voltage', while 1 may represent 'on' or 'true' or 'high voltage'.
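To make this concrete, here is a small Python sketch showing the same quantity three ways: as a decimal number, as a binary numeral, and as the on/off pattern a circuit would hold. The number 45 is borrowed from the odometer example above.

```python
# One quantity, three views: decimal, binary numeral, on/off pattern.
n = 45  # the odometer reading from the example above

bits = bin(n)[2:]                                    # '101101'
pattern = ['on' if b == '1' else 'off' for b in bits]

print(bits)      # 101101
print(pattern)   # ['on', 'off', 'on', 'on', 'off', 'on']
```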
Early digital computers were electromechanical. Data entry was done manually through rotary switches. [In an odometer, data is entered by the rotating wheels; in the Pascaline, data was entered by hand.]
As we saw with the Pascaline and the motorbike's odometer, such machines can perform only the specific operations their design allows. A Jacquard loom, by contrast, can weave various patterns, such as brocades, on textiles, because it can be programmed differently with punched cards.
The Jacquard loom is a small machine with limited capabilities compared to the Harvard Mark I computer. That complicated machinery had the ability to process the data entered into it, but how the data should be processed was dictated by programs in the form of punched paper tape, just as the brocade to be woven was dictated by the punched cards of the Jacquard loom.
The replacement of electromechanical switches, like those used in the Harvard Mark I, with vacuum tubes gave rise to ENIAC, the first electronic general-purpose digital computer. These computers primarily dealt with different sorts of mathematical operations.
A Transistor
A transistor is a semiconductor device used to amplify or switch electronic signals and electrical power. A conductor is a material, like a metal, that conducts electricity with very little resistance, whereas an insulator, like wood, has very high resistance to the flow of electricity, i.e. it does not allow current to flow. A semiconductor, like silicon, falls in between conductor and insulator, and its conducting properties can be altered by introducing impurities. Such semiconductors are used to build transistors.
These transistors are used as electronic switches, replacing the electromechanical switches of machines like the Mark I and the vacuum tubes of ENIAC, greatly reducing the size of a computer while increasing its speed and scope. Being very small, they also consume very little power. Transistors are combined to construct logic gates (based on Boolean algebra), and those gates in turn are used to construct CPUs.
In the picture above, you can see the symbolic representation of different kinds of logic gates, with each gate's behavior in the table just below its symbol. A and B indicate the input terminals. For instance, the output of the AND gate is turned on only if both input terminals are turned on, and so on. These logic gates form the basis of binary computation.
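The gate behavior in those tables can be written out as tiny Python functions, where 0 stands for 'off'/'false' and 1 for 'on'/'true'. This is only a software imitation of what the transistor circuits do physically.

```python
# Each logic gate as a one-line function on bits (0 or 1).
def AND(a, b): return a & b   # on only if both inputs are on
def OR(a, b):  return a | b   # on if at least one input is on
def XOR(a, b): return a ^ b   # on if exactly one input is on
def NOT(a):    return 1 - a   # inverts its single input

# Print each gate's truth table, like the tables under the symbols.
for gate in (AND, OR, XOR):
    print(gate.__name__)
    for a in (0, 1):
        for b in (0, 1):
            print(f"  A={a} B={b} -> {gate(a, b)}")
```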
Now, Let’s Skip to How Modern Computers Work
We all know that the Central Processing Unit (CPU) is the heart and brain of a computer. Several units within the CPU are designed to perform specific actions, for example, sending and receiving signals to and from RAM. The CPU contains complex electronic circuitry made of hundreds of millions of transistors and accompanying circuits.
If you remember the Pascaline, its interconnected wheels and other mechanical parts were designed to work together to perform simple mathematical operations; the Pascaline itself knew nothing of the calculation. If modern computers are compared with the Pascaline, the transistors are analogous to those wheels, and the rest of the circuitry to the axles and other parts.
The transistors composing the logic gates transmit or block electric current, much as one wheel of the Pascaline caused the adjacent wheel to rotate after a complete rotation (a piece of 'information' flowing from one wheel to another).
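The binary 'carry' is the electronic cousin of one Pascaline wheel nudging the next after a full rotation. As a sketch (a textbook 'half adder', not any specific CPU's circuit), two gates suffice to add two one-bit numbers, and chaining such adders adds whole numbers:

```python
# A half adder: two logic gates adding two one-bit numbers.
def half_adder(a, b):
    sum_bit = a ^ b   # XOR: 1 if exactly one input is on
    carry   = a & b   # AND: 1 only if both are on; this bit is
                      # passed along to the next stage, like a
                      # wheel nudging its neighbour
    return sum_bit, carry

print(half_adder(1, 1))  # (0, 1): in binary, 1 + 1 = 10
```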
When a user presses, say, 'a' on the keyboard, the change of current in the specific circuitry under that key flows to the CPU, where it brings about further changes of electric current, which are then relayed to other parts of the computer. Likewise, if a user types '1' + '2' on the keyboard, the changes these keystrokes bring about are processed accordingly in the CPU.
The point I intend to make here, which troubled me earlier, is that it is not any inherent intelligence of the CPU (of course) that lets it perform actions. Rather, electric currents are sent into the CPU, where hundreds of millions of transistors and accompanying circuits pass the altered (processed) currents onward, finally to the output, where they are made human-readable. The Pascaline, likewise, performed its calculations mechanically, and the results were then made human-readable.
One of the basic factors in the design of a CPU is that it works in the binary system with the help of logic gates. Numbers, text, images, sounds, and all other kinds of data are transformed into the on and off of electricity, just as at the receptors of neurons all kinds of stimuli are converted into electrochemical information. If a person pinches you, the pressure causes the release of chemicals at nerve endings, which changes the electrical state of the neurons, and the information is carried along them. If you witness a beautiful scene, the light rays from the scenery stimulate your retina to convert them into electrochemical signals. Similarly, every piece of data must first be coded with a specific binary representation, which the input devices then produce (as on and off states). What the CPU should do with the data is controlled by programs, like the punched cards in the Jacquard loom.
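The 'coding' step can be glimpsed with text: every character has an agreed-upon number (its ASCII/Unicode code), and that number has a binary pattern of ons and offs. A minimal Python sketch:

```python
# Every character maps to a number, and every number to on/off bits.
message = "a1"
for ch in message:
    code = ord(ch)                        # the agreed-upon character code
    print(ch, code, format(code, '08b'))  # the 8-bit on/off pattern
# a 97 01100001
# 1 49 00110001
```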
Precisely understanding the vast science of computers is not possible in a single blog post, obviously. Components integral to the working of a computer, like operating systems and software, are not discussed here at all (and about them I know, of course, very little). The languages designed to communicate with computers, the bits, the bytes, and all the rest are ignored in bulk. My aim was just to convey ONE basic thing: how a modern computer is said to calculate without bearing any inherent intelligence. I hope this post has left you, even if a little confused, with a reflection on how vast computer science is.