
When you clasp your hands together, you’re holding humanity’s first calculator. Ten fingers, ten physical bits that can be raised or folded: 1 … 10.
That simple trick—mapping “how many” onto “how many fingers are up”—was the first time mind became matter. Everything in computing begins with that moment.
1. Fingers Become Memory
Picture a band of hunter-gatherers tracking days since the last rainfall. Someone scratches a notch into a branch at sundown, one per day. Days stack up; so do notches. The branch is now a persistent extension of the brain: you no longer have to remember the count; you can read it.
Thousands of years later, on the shores of Lake Edward, archaeologists unearth the Ishango Bone, its three columns of grooves frozen mid-calculation.
What you’re looking at is the earliest external state register.

2. Memory Learns to Move
Fast-forward to bustling markets along the Yellow River. Merchants juggle taxes, harvests, debts. Notches are too slow. Enter the abacus: rows of beads that slide. A flick to the left sets a bead “ON”, a flick to the right “OFF”. In one gesture you change a number and see the result. Abacus masters compute compound interest faster than untrained minds can add, foreshadowing a core idea of computer science: hardware acceleration.

3. Gears Catch the Rhythm of Thought
Now listen to 17th-century Rouen. Blaise Pascal’s father is an overworked tax clerk; young Blaise watches him slog through sums and decides the beads should turn themselves. He designs a box where digits live on toothed wheels. Whenever one wheel completes ten clicks, a little tooth nudges the next. Carry propagation in bronze.
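
To see the same carry rule outside the metal, here is a minimal Python sketch (a toy model, not Pascal’s mechanism; the three-wheel counter and the function name are purely illustrative):

```python
# A toy Pascaline: each "wheel" holds one decimal digit.
# When a wheel passes 9 it resets to 0 and nudges its left
# neighbour, just as the tooth nudges the next wheel in bronze.

def add_one(wheels):
    """Advance the rightmost wheel one click, carrying leftward."""
    for i in reversed(range(len(wheels))):
        wheels[i] += 1
        if wheels[i] < 10:
            return wheels      # no carry needed: done
        wheels[i] = 0          # wheel rolls over; carry moves left
    return wheels              # every wheel rolled over (overflow)

counter = [0, 9, 9]            # the number 099
print(add_one(counter))        # [1, 0, 0] -> the carry ripples twice
```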

Gottfried Leibniz soon replaces the wheel with a stepped drum that can advance by any number of teeth, allowing the machine to multiply, divide, even take roots. The drum is an early micro-program: repeat a simple operation at high speed until the answer emerges.
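
The drum’s trick translates to a few lines of code: multiplication built from nothing but repeated addition (a sketch of the idea only; the real machine stepped gears, it did not loop):

```python
# Leibniz's stepped drum in miniature: build multiplication out of
# the one operation the machine already had, addition.

def multiply(a, b):
    total = 0
    for _ in range(b):    # one drum revolution per unit of b
        total += a
    return total

print(multiply(7, 6))     # 42
```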

Notice the thread: every new device removes a manual step. Beads remove memorisation; gears remove repetitive addition. Efficiency climbs, but the human still sets every operation.

4. Holes Teach Cloth (and Numbers) to Obey
Jump to Lyon, 1804. Joseph-Marie Jacquard installs a chain of punched cards on a loom. A hole lets a pin rise; no hole keeps it down. Pins translate directly into a pattern of raised threads. Swap the card deck and the loom weaves a new design. For the first time, instructions and data are stored outside the mechanism, ready to be rearranged at will.
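
A rough way to feel this in code: the deck below is pure data, and swapping it changes the woven pattern without touching the “machine” (the hole encoding, # for hole and . for no hole, is invented here for illustration):

```python
# Each card row is data read by the loom: '#' = hole (pin rises,
# thread lifts), '.' = no hole (pin stays down). Swap the deck and
# the same mechanism produces a different pattern.

cards = [
    "#.#.#.#.",
    ".#.#.#.#",
    "##..##..",
]

for card in cards:
    # A raised thread prints as a solid block, a lowered one as a gap.
    print("".join("█" if c == "#" else " " for c in card))
```
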
Charles Babbage reads about Jacquard’s cards, imagines gears obeying holes instead of threads, and sketches the Analytical Engine: steam-powered, card-controlled, and complete with a “mill” (the ALU) and a “store” (the memory). Though never finished, his drawings map almost one-to-one onto the block diagram of a modern CPU.

5. “Computer” Means You, Until It Doesn’t
Throughout the 19th century, the word computer appears in ledgers and job adverts. It describes people, usually women, paid to perform arithmetic all day long. They work in teams, each verifying the other’s columns. Reliability comes from redundancy, not speed. Then Herman Hollerith attaches brass contacts to punched cards, lets electricity flow where the holes appear, and tallies the 1890 U.S. Census in record time. The machine is ten times faster than human clerks, and it essentially never tires or errs. The title “computer” begins its migration from flesh to metal.


6. Ideas Overtake Mechanisms
In 1936, Alan Turing pens a thought experiment: an infinite strip of tape, a read/write head, and a table of symbolic rules. The Turing Machine proves that computation is not gears or electricity; it is logical steps applied to symbols. Hardware is merely one way to enact those steps.
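
Here is that idea as a minimal sketch in Python: a tape, a head, a state, and a rule table; this particular table adds 1 to a binary number (it is an illustration, not one of the machines from Turing’s paper):

```python
# A minimal Turing machine: tape, read/write head, state, rule table.
# (state, symbol) -> (symbol to write, head move, next state)
rules = {
    ("carry", "1"): ("0", -1, "carry"),  # 1 plus carry: write 0, keep carrying
    ("carry", "0"): ("1",  0, "halt"),   # 0 plus carry: write 1, done
    ("carry", "_"): ("1",  0, "halt"),   # blank edge: a new leading digit
}

tape = list("1011")                      # eleven, in binary
head, state = len(tape) - 1, "carry"     # start at the rightmost digit

while state != "halt":
    symbol = tape[head] if 0 <= head < len(tape) else "_"
    write, move, state = rules[(state, symbol)]
    if head < 0:
        tape.insert(0, write)            # grow the tape to the left
        head = 0
    else:
        tape[head] = write
    head += move

print("".join(tape))                     # 1100, i.e. twelve
```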

7. Electrons Shrink the Universe
World War II accelerates everything. The ENIAC fills a ballroom with 18,000 vacuum tubes, each switching like a bead, but a million times per second. Programs are still wired by hand; reconfiguring can take days.

Two years later, inside Bell Labs, Bardeen and Brattain coax a sliver of germanium to switch using minuscule charges. The transistor is a bead you can’t see, a gear that never wears, a notch you can flip billions of times without lifting a screwdriver. Vacuum tubes vanish; the word micro joins computer.

8. From Hands to Hardware: The Unbroken Line
- Representation — fingers, notches, beads, wheels, holes, electrons.
- Automation — externalising memory leads to externalising procedure.
- Abstraction — every new tool hides the ugliness beneath: beads hide counting rules, punched cards hide gear ratios, high-level languages hide binary.
The story is continuous: each layer grows directly from a frustration with the last: too slow, too fragile, too manual.
Your Turn
Try closing the loop.
Hold up your hands. Assign the leftmost finger the value 1, the next 2, doubling each time. With ten fingers you can now represent any number up to 1023 in binary. You’re literally a 10-bit register.
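
If you want to check that arithmetic, here is the same trick as a quick Python sketch (the particular raised/folded pattern is just an example):

```python
# Ten fingers as a 10-bit register: finger i is worth 2**i when raised.
fingers = [1, 0, 1, 1, 0, 0, 0, 0, 0, 0]   # leftmost finger = value 1

value = sum(bit << i for i, bit in enumerate(fingers))
print(value)                                # 13 = 1 + 4 + 8

print(sum(1 << i for i in range(10)))       # 1023: all ten raised
```
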
That playful insight is exactly where we pick up in Lesson 2, when we dive into binary arithmetic and the first logic gate you can wire on paper. Keep your fingers limber; we’re about to teach them to sing in zeros and ones.