A small dot on an old piece of birch bark marks one of the biggest events in the history of mathematics. The bark is part of an ancient Indian mathematical document known as the Bakhshali manuscript. And the dot is the first known recorded use of the number zero. What’s more, researchers from the University of Oxford recently discovered that the document is some 500 years older than previously estimated, dating it to the third or fourth century AD.
Today, it’s difficult to imagine how you could have mathematics without zero. In a positional number system, such as the decimal system we use now, the location of a digit determines its value. Indeed, the only difference between 100 and 1,000,000 is where the digit 1 is located, with the symbol 0 serving as a punctuation mark.
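To make the placeholder role concrete, here is a minimal sketch in Python (an illustration added here, not part of the original article) of how a positional system weights each digit by a power of the base. Strip out the zeros and 100 and 1,000,000 both collapse to plain 1.

```python
# A minimal sketch of positional notation: each digit is weighted by a
# power of the base, so 0 acts as a placeholder that keeps the other
# digits in their proper columns.

def positional_value(digits: str, base: int = 10) -> int:
    """Interpret a digit string in the given base."""
    value = 0
    for d in digits:
        value = value * base + int(d)
    return value

print(positional_value("100"))       # 100
print(positional_value("1000000"))   # 1000000
# Remove the placeholder zeros and both collapse to the same value:
print(positional_value("1"))         # 1
```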
Yet for thousands of years we did without it. The Sumerians of 5,000 years ago employed a positional system, but without a 0. In some rudimentary form, a symbol or a space was used to distinguish between, for example, 204 and 20000004. But that symbol was never used at the end of a number, so the difference between 5 and 500 had to be determined by context.
What’s more, a 0 at the end of a number makes multiplying and dividing by 10 easy, and it is needed to record the result of adding numbers such as 9 and 1. The invention of zero immensely simplified computations, freeing mathematicians to develop vital mathematical disciplines such as algebra and calculus, and eventually laying the basis for computers.
Zero’s late arrival was partly a reflection of the negative views some cultures held of the concept of nothing. Western philosophy is plagued with grave misconceptions about nothingness and the mystical powers of language. The fifth-century BC Greek thinker Parmenides proclaimed that nothing cannot exist, since to speak of something is to speak of something that exists. This Parmenidean approach kept prominent historical figures busy for a long while.
After the advent of Christianity, religious leaders in Europe argued that since God is in everything that exists, anything that represents nothing must be satanic. In an attempt to save humanity from the devil, they promptly banished zero from existence, though merchants continued to use it in secret.
By contrast, in Buddhism the concept of nothingness carries no demonic associations; it is actually a central idea worthy of much study en route to nirvana. With such a mindset, having a mathematical representation for nothing was, well, nothing to fret over. In fact, the English word “zero” ultimately derives from the Sanskrit “śūnya”, meaning empty or void; the related “śūnyatā”, or emptiness, is a central concept in Buddhism.
So after zero finally emerged in ancient India, it took almost 1,000 years to take root in Europe, much longer than in China or the Middle East. In 1200 AD, the Italian mathematician Fibonacci, who brought the decimal system to Europe, wrote:
The method of the Indians surpasses any known method to compute. It’s a marvellous method. They do their computations using nine figures and the symbol zero.
This superior method of computation, clearly reminiscent of our modern one, freed mathematicians from tediously simple calculations, and enabled them to tackle more complicated problems and study the general properties of numbers. For example, it paved the way for the seventh-century Indian mathematician and astronomer Brahmagupta, whose work is considered the beginning of modern algebra.
The Indian method is so powerful because it means you can draw up simple rules for doing calculations. Just imagine trying to explain long addition without a symbol for zero. There would be too many exceptions to any rule. The ninth century Persian mathematician Al-Khwarizmi was the first to meticulously note and exploit these arithmetic instructions, which would eventually make the abacus obsolete.
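Just how simple those rules become is easy to see in code. The sketch below (a Python illustration added here, not Al-Khwarizmi’s own formulation) performs long addition digit by digit: with zero available as an ordinary digit, every column obeys the same add-and-carry rule, with no special cases.

```python
# A sketch of long addition with digit-wise carrying. With zero as an
# ordinary digit, every column follows the same rule: write the sum
# modulo 10, carry the rest to the next column.

def long_addition(a: str, b: str) -> str:
    """Add two non-negative decimal digit strings, column by column."""
    width = max(len(a), len(b))
    a, b = a.zfill(width), b.zfill(width)   # pad with leading zeros
    digits, carry = [], 0
    for da, db in zip(reversed(a), reversed(b)):
        column = int(da) + int(db) + carry
        digits.append(str(column % 10))   # the digit written in this column
        carry = column // 10              # at most 1, passed leftward
    if carry:
        digits.append(str(carry))
    return "".join(reversed(digits))

print(long_addition("9", "1"))     # 10 -- unrecordable without a zero
print(long_addition("204", "96"))  # 300
```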
Such mechanical sets of instructions illustrated that portions of mathematics could be automated. And this would eventually lead to the development of modern computers. In fact, the word “algorithm” to describe a set of simple instructions is derived from the name “Al-Khwarizmi”.
The invention of zero also created a new, more accurate way to describe fractions. Adding zeros at the end of a number increases its magnitude, while, with the help of a decimal point, adding zeros at the beginning decreases it. Placing infinitely many digits to the right of the decimal point corresponds to infinite precision. That kind of precision was exactly what the 17th-century thinkers Isaac Newton and Gottfried Leibniz needed to develop calculus, the study of continuous change.
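As a brief illustration (again a sketch, using Python’s standard decimal module rather than anything from the original article), appending zeros scales a number up, while the decimal point lets zeros scale it down to fractions of any desired precision:

```python
from decimal import Decimal

# Zeros on the right scale a number up; the decimal point lets zeros
# on the left scale it down instead.
x = Decimal("5")
print(x * 10)     # 50
print(x / 10)     # 0.5
print(x / 1000)   # 0.005

# More digits to the right of the point mean more precision; the
# default context carries 28 significant digits.
print(Decimal(1) / Decimal(3))   # 0.3333333333333333333333333333
```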
And so algebra, algorithms, and calculus, three pillars of modern mathematics, are all the result of a notation for nothing. Mathematics is a science of invisible entities that we can only understand by writing them down. India, by adding zero to the positional number system, unleashed the true power of numbers, advancing mathematics from infancy to adolescence, and from rudimentary counting to its current sophistication.
This article was originally published on The Conversation. Read the original article.