The origins of computing can be found in a recurring need that people have had in many contexts: the need to calculate as precisely as possible. Calculations are present in our daily life and are necessary for many things: commerce, accounting for private companies, collecting taxes, statistics in research, drawing building plans, etc. The world would not function in the same way without mathematics, and humans have always looked for new ways to optimize calculation to meet this need.
The first calculators
In this context, thanks to advances in the field of mechanics, the first calculators appeared in the mid-17th century as an evolution of the traditional abacus, a calculation aid whose origins go back more than two millennia.
The first mechanical calculator was created in 1623 by Wilhelm Schickard. This device used a system of gears that allowed additions, subtractions and multiplications, and had a mechanism for recording the progress of these calculations, acting as a kind of memory. A few years later, in 1645, Blaise Pascal's famous adding machine (the Pascaline) would also appear, and over the following years many other mathematicians would try to surpass it.
Among others, Leibniz's machine should be highlighted: at the end of the 17th century, Leibniz also laid the foundations of the binary system, which would come into practical use some two and a half centuries later with the arrival of the first electronic computers.
Parallel to the evolution of calculators, which accelerated throughout the 19th century, other calculating tools emerged to meet more specific needs, such as the prediction of tides. The first such predictive machines were built at the end of the 19th century; from the same family of analog devices came the differential analyzers, machines that could solve differential equations and responded to needs related to the military field.
As in the case of calculators, these devices also evolved over time: from mechanical differential analyzers, used mainly during the last two decades of the 19th century and the first two decades of the 20th, to non-mechanical analyzers, which appeared in the 1920s with the integration of electrical circuits. The first electromechanical differential analyzers attracted the attention of the military, who saw them as an opportunity to calculate the trajectories of their projectiles or the exact location of their targets. Beyond their wartime use, however, they were not practical devices, because they were too large and noisy.
The forerunners of computers
The history of digital calculators, or computers, begins at the end of the 1930s, although it builds on earlier ideas such as the Jacquard loom, considered one of the precursor machines of computers. Invented by Joseph-Marie Jacquard at the beginning of the 19th century, this device could be attached to a loom and used punched cards to "program" patterns that were then woven onto the fabric. This idea from the textile industry was applied to all kinds of mechanical devices and inspired creations as important as Charles Babbage's analytical engine or the card-tabulating machine that Herman Hollerith proposed as an administrative solution for processing the 1890 United States census.
Charles Babbage was a scientist very interested in performing calculations automatically. He began to investigate how to create a machine capable of doing them by the method of differences, which reduces the tabulation of polynomials (and of functions approximated by polynomials) to repeated additions and subtractions. The machine was called the difference engine; he presented a first design in 1822 and a second about a decade later, but the budget needed to build it was too high and the British government was not interested enough to subsidize it.
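To make the method of differences concrete, here is a minimal sketch (the function name and interface are illustrative, not Babbage's own notation): once the initial finite differences of a polynomial are known, every further value of its table can be produced using additions alone, which is exactly what the difference engine mechanized.

```python
def tabulate(initial_differences, steps):
    """Tabulate a polynomial from its initial finite differences.

    initial_differences[0] is f(0); the remaining entries are the
    successive finite differences at 0. Only addition is used,
    mirroring the operation of a difference engine.
    """
    diffs = list(initial_differences)
    values = []
    for _ in range(steps):
        values.append(diffs[0])
        # Update each difference by adding the next-order difference.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# Example: f(x) = x^2 + x + 1 has f(0) = 1, first difference 2
# (f(1) - f(0)), and constant second difference 2.
print(tabulate([1, 2, 2], 5))  # → [1, 3, 7, 13, 21]
```

Because a degree-n polynomial has a constant n-th difference, a fixed set of adding wheels suffices to tabulate it indefinitely: no multiplication mechanism is needed.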
During the process of creating the difference engine, Babbage also designed another machine that would have been more economical, but that project did not go ahead either. It was the analytical engine, capable of performing any calculation specified to it by a program on punched cards. This machine was, in effect, the design of a computer, but, as has often happened throughout history, the idea was too far ahead of the technology of its time and could not be realized. Babbage had conceived the analytical engine only to perform calculations, but Ada Lovelace, who worked with him and is considered the first programmer in history for writing an algorithm intended to run on the machine, understood that its possibilities were not limited to doing math, as some of her notes indicate.
Herman Hollerith was an American statistician who had been hired to address an administrative need: the difficulty of processing the population census. His solution was the design of a tabulating machine and a system of cards that greatly sped up the process and that, at the beginning of the 20th century, gave rise to the commercialization of many similar machines (tabulators, sorters, verifiers, registers, etc.) and to the creation of a large company that monopolized them (a business that would later become IBM, with Thomas Watson as CEO). The business emerged even stronger from the crash of 1929 and the policies of the 1930s, and tabulators eventually made the leap from accounting to the scientific realm, where calculation was also essential. If Babbage had marked the path in the techno-scientific field, Hollerith was doing so at the business level.
The birth of computers
The technological advances of the first half of the 20th century gave a final push to the idea of a universal machine that Babbage had begun to imagine. Inventions such as the telephone drove this development: the calculation needs of the well-known Bell Labs accelerated the evolution of calculators, and concepts anticipated by Babbage's machine, such as programs and program libraries, came into use. While at Bell Labs this line of work was led by the mathematician George Stibitz, in Nazi Germany in 1941 Konrad Zuse followed a similar path that led him to build the Z3, the first universal programmable calculator.
Finally, IBM's work, with the collaboration of Harvard, led to the creation of the Mark I, publicly presented in 1944: the first electromechanical computer and the last forerunner of modern electronic computers. In this project we can see the intention to build a universal machine, following the line marked by Stibitz on the foundations laid earlier by Babbage, with an operation that evolved from the preceding machines and still had, as its main goal, responding to calculation needs.
We must not forget that even the ENIAC (1946), considered the first general-purpose computer in history, still pursued the same purpose as the first calculators, now taking advantage of all the technological advances: its designers set out to build a calculator that could compute much faster (specifically, projectile and artillery trajectories during World War II) and that was electronic instead of mechanical. What they built, however, was a true revolution.