From Tally Marks to Touchscreens: The Remarkable Journey of Numbers
Somewhere in a cave in central Africa, roughly 20,000 years ago, someone carved a series of lines into a piece of bone. The Ishango Bone, as we call it today, contains groups of marks that some scientists believe represent numbers—or perhaps a lunar calendar, or even a primitive multiplication table. Whatever the purpose, this humble artifact marks one of humanity's first attempts to represent quantity symbolically.
That bone is now in a museum in Belgium. The numbers on your phone, which descend directly from those ancient scratches, appear instantly when you type. The journey between them spans nearly every major civilization humanity has ever produced.
The First Counting Systems: Tally Marks and Beyond
Before numbers had names, humans counted with what they had. Fingers, obviously. Stones. Sticks. Seashells strung together as a primitive abacus. In many languages, the words for "five" and "hand" share roots to this day.
Tally marks remain the most intuitive system: one line per item, grouped in sets of five (four vertical, one diagonal strike-through). Prison walls still bear testimony to this. So do old accounting ledgers, where clerks would mark each sale with a stroke. The system works, but it's brutally limited. Try representing "one thousand" with tally marks and you'll understand why civilizations eventually innovated.
Some societies jumped directly from tallies to sophisticated systems. The ancient Egyptians used hieroglyphics for numbers as early as 3400 BCE. Their system used strokes for ones, heel-bone shapes for tens, coiled rope for hundreds, and lotus flowers for higher values. Complex? Yes. But it let them count into the millions and record quantities essential for building pyramids and collecting taxes.
Babylon: Cuneiform and the Birth of Place Value
The Babylonians, living in what is now Iraq, developed one of history's most influential number systems—and they did it on clay tablets using a reed stylus pressed into wet mud. Theirs was a base-60 system, which explains why we have 60 minutes in an hour and 360 degrees in a circle. (360 = 6 × 60, if you're wondering.)
Around 1800 BCE, Babylonian mathematicians made a conceptual breakthrough that wouldn't be fully replicated for millennia: place value. The symbol for "1" in the first position meant one. The same symbol in the second position meant sixty. In the third, it meant 3,600. This allowed them to represent any number with just two cuneiform symbols—for "1" and "10"—combined in different arrangements.
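The positional idea can be sketched in a few lines of modern code. The function name and digit-list representation below are my own conveniences, not anything Babylonian, but the loop captures exactly what place value means: each position multiplies the running total by the base.

```python
def from_base60(digits):
    """Interpret a list of base-60 digits (most significant first)
    as a single integer, the way Babylonian place value worked."""
    value = 0
    for d in digits:
        value = value * 60 + d
    return value

# The same digit "1" names different amounts depending on position:
print(from_base60([1]))        # 1
print(from_base60([1, 0]))     # 60
print(from_base60([1, 0, 0]))  # 3600
```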
A clay tablet called Plimpton 322, dating to 1800 BCE, contains Pythagorean triples—sets of three numbers like 3, 4, 5 that satisfy the Pythagorean theorem—over a thousand years before Pythagoras was born. We don't know who created it, but it suggests Babylonian mathematics was far more advanced than many history books acknowledge.
The Mayan Innovation: Zero and the Long Count
While Europeans struggled without a zero, the Maya in Central America invented it around 350 CE. Their vigesimal (base-20) system used dots, bars, and a distinctive shell symbol for zero. The shell appears in their famous Long Count calendar, which tracked dates in cycles extending millions of years into the past and future.
The Mayans built observatories to track celestial movements with extraordinary precision. Their calendar was more accurate than the one European civilizations used for daily life until the Gregorian reform of 1582. Without zero—a concept the Maya arrived at independently of the Indian zero that would later transform Western math—this precision would have been impossible.
India's Gift: The System We Actually Use
The numbers you learned in school, the ones that now appear on every screen and keypad on Earth, came from India. This Hindu-Arabic numeral system emerged around 500 CE, synthesized from earlier Indian systems and spread westward through Islamic civilization.
What made it revolutionary? Several features working together. First, ten digits (0 through 9) with specific shapes. Second, place value—the same digit representing different amounts based on position. Third, zero as an actual number with its own symbol, not just an empty space.
Indian mathematicians like Brahmagupta (7th century CE) and Bhaskara (12th century CE) developed rules for working with zero that seem obvious now but required profound conceptual leaps. Zero isn't just "nothing." It behaves predictably in equations. You can add, subtract, and multiply with it. Dividing by zero, however, raises questions that mathematicians didn't fully settle until the rigorous treatment of limits in the 19th century.
The Persian mathematician Al-Khwarizmi wrote a treatise around 820 CE that introduced Hindu numerals to the Islamic world. His name gives us the word "algorithm." The title of his separate book on solving equations, which contains the term al-jabr, gives us "algebra." Through Islamic scholarship, these ideas eventually reached Europe, though it took several centuries for merchants and scholars to abandon Roman numerals.
Roman Numerals: Why They Eventually Lost
Roman numerals persisted in Europe for over a thousand years after the Hindu-Arabic system existed. For a long time, this made sense. Roman numerals are readable, if you learn the rules. They can be inscribed in stone with simple tools. They served adequately for recording dates, numbering chapters, and tracking quantities in ledgers.
But multiplication and division? Nightmarish. Try calculating XLIV × VII in Roman numerals. (Answer: 308, if you're curious.) Merchants doing business across cultures needed something better. The Hindu-Arabic system's place value and zero made arithmetic dramatically simpler. Algorithms that took pages in Roman numerals could be completed in lines.
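To see why positional arithmetic won, here is a small Python sketch of the XLIV × VII example. The conversion tables and helper names are my own; medieval merchants would have used an abacus or counting board, not this algorithm. The point is that once the numerals are translated into place value, the multiplication itself becomes trivial:

```python
# Value-to-numeral pairs, largest first, including subtractive forms.
ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
         (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
         (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

VALS = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def roman_to_int(s):
    """Sum each symbol's value, subtracting when a larger one follows."""
    total = 0
    for cur, nxt in zip(s, s[1:] + " "):
        v = VALS[cur]
        total += -v if VALS.get(nxt, 0) > v else v
    return total

def int_to_roman(n):
    """Greedily emit the largest numeral that still fits."""
    out = []
    for value, numeral in ROMAN:
        while n >= value:
            out.append(numeral)
            n -= value
    return "".join(out)

print(roman_to_int("XLIV") * roman_to_int("VII"))  # 308
print(int_to_roman(308))                           # CCCVIII
```

Notice that all the real work happens in the conversions; the multiplication is one `*`. That asymmetry is the whole argument for place value.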
The transition wasn't smooth. In 1299, Florence banned Hindu-Arabic numerals for bookkeeping, fearing they were too easy to alter (turning a 0 into a 6 or 9 is trivial compared to changing V to X). By the 1500s, however, the advantages were undeniable. Even the Catholic Church, no fan of Islamic innovations, accepted the new system because it worked.
The Digital Present: Numbers in the Computer Age
Your smartphone processes numbers in binary—strings of 0s and 1s—because electronic circuits have two states: on and off. The same ten digits we use for everyday math get translated into a different base entirely when you send a text message or load a webpage.
Underneath this transformation lies the same mathematical framework that Babylonian mathematicians would have recognized. Place value works the same way whether you're counting in tens or in twos. Zero represents the same concept. Algorithms for addition, subtraction, multiplication, and division operate on identical principles.
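That continuity is easy to demonstrate. The same place-value loop works unchanged in base 2 and base 10 (the function below is a generic sketch of my own, not any particular hardware's arithmetic):

```python
def from_base(digits, base):
    """Place value works identically in any base: each position
    multiplies the running total by the base before adding the digit."""
    value = 0
    for d in digits:
        value = value * base + d
    return value

# The bit pattern 111000 and the decimal digits 5, 6 name the same number:
print(from_base([1, 1, 1, 0, 0, 0], 2))  # 56
print(from_base([5, 6], 10))             # 56
```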
The numbers on your screen are abstractions, really—mathematical objects that behave according to rules developed over thousands of years and across every major civilization. When you tap 7 × 8 = 56, you're using concepts that emerged in ancient caves, were refined on Babylonian clay, flourished in Mayan temples, traveled through Islamic universities, and now live in silicon circuits thinner than a human hair.
Not bad for scratches on a bone.