Article contributed by Charles Petzold

How does a computer add?

It’s shocking that so many experienced programmers can’t answer that question. Not exactly, anyway.

Of course, everybody knows how to add, and anybody with some programming experience can write a little computer program to add two numbers together. Addition is the most basic of arithmetical operations, and every programming language lets you add with a very simple statement such as C = A + B. What could be easier?

Yes, but how does the computer add those two numbers?

Most programmers use high-level languages such as C++ or JavaScript or Python that are not directly understood by the computer. A compiler or interpreter must translate statements such as C = A + B into machine code that is executed by the central processing unit (CPU) of the computer. CPUs generally implement storage areas called registers to store numbers. Machine code instructions move numbers from memory to registers, other instructions add those numbers, and still others move the results back to memory.

But how does the CPU add those two numbers?

A CPU contains a component called the Arithmetic Logic Unit (ALU) that is responsible for basic arithmetic operations such as addition and subtraction. The CPU inputs the two numbers to be added into the ALU and the output of the ALU is the sum.

Yes, but how does the ALU add those two numbers?

Ahh, now we’re getting much closer to the answer to this perplexing problem. The CPU is composed of transistors. These transistors are switches much like the light switches in your home. Flip the switch one way to turn the light on, and the other way to turn it off. But the crucial difference with a transistor is that it can be turned on or off by another transistor, and that transistor can turn another transistor on or off.

Pairs of transistors can be wired together in various ways to build logic gates. If you wire two transistors one after another, you can create a circuit that implements the logical operation known as AND: The output is on only if both transistors are on. A pair of transistors wired side-by-side in parallel implements another basic logical operation called OR: The output is on if one or the other of the two transistors is on. A transistor can even be wired so that the output is the opposite of the input. This is known as NOT. The opposite of an OR logic gate is called NOR, and the opposite of an AND logic gate is NAND. These basic logic gates can be combined to perform more complex operations.
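The behavior of these gates can be sketched in software. This is just an illustration, of course, not how the hardware works: here each gate is modeled as a small Python function operating on the bits 0 and 1.

```python
# A sketch of the five basic logic gates, modeled as functions on bits 0/1.

def AND(a, b):
    return a & b        # on only if both inputs are on

def OR(a, b):
    return a | b        # on if either input is on

def NOT(a):
    return 1 - a        # the opposite of the input

def NAND(a, b):
    return NOT(AND(a, b))   # the opposite of AND

def NOR(a, b):
    return NOT(OR(a, b))    # the opposite of OR

print(AND(1, 1), OR(0, 1), NOT(1), NAND(1, 1), NOR(0, 0))  # 1 1 0 0 1
```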

From these humble beginnings, an entire digital computer can be built, including the ALU that adds two numbers together.

Every number in the CPU is a series of bits. The sum of two bits A and B is calculated as (A NAND B) AND (A OR B), so three basic logic gates are required to add two bits together. But each addition also results in a possible carry, and that bit is calculated as A AND B, so another logic gate is required. For each subsequent pair of bits, the carry from the previous addition must be added in, requiring even more logic gates. Adding 8-bit, 16-bit, or 32-bit numbers requires lots of logic gates, but fortunately they’ve been shrunk down to microscopic size.
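Using the gate functions of the sort sketched earlier, the adder described above can be illustrated in Python. This is a simplified model (a "ripple-carry" arrangement), not a blueprint of any particular CPU: each pair of bits is summed with the (A NAND B) AND (A OR B) formula, the carry is computed with AND gates, and the carry from each bit position feeds into the next.

```python
def sum_bit(a, b):
    # The sum of two bits: (A NAND B) AND (A OR B) -- three basic gates.
    return (1 - (a & b)) & (a | b)

def full_adder(a, b, carry_in):
    # Add two bits plus the carry from the previous position.
    s1 = sum_bit(a, b)
    s = sum_bit(s1, carry_in)
    carry_out = (a & b) | (s1 & carry_in)
    return s, carry_out

def add_8bit(x, y):
    # Ripple the carry through all eight bit positions, low bit first.
    carry, result = 0, 0
    for i in range(8):
        a = (x >> i) & 1
        b = (y >> i) & 1
        s, carry = full_adder(a, b, carry)
        result |= s << i
    return result        # wraps around past 255, like an 8-bit register

print(add_8bit(100, 55))  # 155
```

Note that the sum wraps around modulo 256, just as it would in a real 8-bit register when the final carry has nowhere to go.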

And that is how a computer adds.

But do you really need to know this? Is it really necessary to know what’s going on inside the CPU?

Some people think not. They believe that the computer should be treated as a ‘black box’ and that programming languages should be regarded as abstract representations of tasks for the computer to perform.

I disagree. The less we know about the inner mechanisms of computers, the closer we come to treating the computer as if it’s ‘magic’ of some sort. It’s not magic. Bits are very real, and the electronic manipulation of bits is at the core of these marvellous devices.

I believe that we can achieve a much deeper understanding of computers by peeling away the layers that separate us from those inner mechanisms. And the more you know about the inner life of computers, the more comfortable you’ll feel telling the computer what to do and how to do it.

And what’s more: If someone asks you how a computer adds, you’ll be able to tell them!

Charles Petzold, author of Code: The Hidden Language of Computer Hardware and Software, 2nd edition, has been writing about computers and programming for almost 40 years. He is the author of the classic Programming Windows series and other books about Windows programming, as well as The Annotated Turing: A Guided Tour through Alan Turing’s Historic Paper on Computability and the Turing Machine.

For more information, please visit CodeHiddenLanguage.com.
