Analogue computers have long been considered outdated technology. They were the pinnacle of computing until the 1970s, after which digital computing came to the fore. The results since then speak for themselves: almost every device we use in our daily lives today, from smartphones to toothbrushes, has a digital computing device on board. Digital computers are simply unmatched at general computing and solve a vast range of our problems practically. Yet, of late, attention has started moving back towards analogue computers. We are now trying to pick up the progress that stalled in the early 1970s, with both scientific researchers and technology start-ups investing time and resources into developing the analogue computer technology of the future. Why though? In this article, I try to answer that very question. But in order to get to the 'why', we need to cover a few things first. I'll start with the origins of analogue computing.
The Origins of Analogue Computers
In 1901, sea divers discovered an ancient shipwreck off the coast of a tiny Greek island called Antikythera. Among the recovered treasure was what is now considered the oldest known analogue computer in human history: the Antikythera mechanism. It is an ancient Greek hand-powered model of our solar system, used to compute and predict the relative positions and motions of the planets and moons, and thereby to predict eclipses decades in advance. What drove the ancients to build such a device?
Reasoning by analogy seems to be a fundamental human trait. Whenever human beings observed patterns in phenomena, they tried to reduce the patterns to smaller models that were then used to predict future patterns. It is no coincidence that children play with small toys that represent bigger real-world objects. Human beings seem to have the innate need and ability to reduce phenomena out of their reach into smaller toy models that they can reach and play with.
Analogue Computers Are Hidden in Plain Sight
When we fast forward to today, we see digital computing devices everywhere. What we take for granted, though, are the analogue computing devices hidden in plain sight. The thermometer used to measure body temperature can, in its own right, be considered an analogue computer: it reduces our sensation of temperature to a model that moves a liquid along a scale with measurable numbers next to it. Likewise, the conventional speedometer used in cars and other vehicles converts the speed of rotating physical parts into a numerical reading in kilometres per hour (or similar units). This analogue computer uses complex mechanisms to reduce our sense of speed to a measurable quantity.
The seismometer is yet another analogue computer, one that reduces the earth's movements into a measurable model of earthquakes. Other examples include medical devices (used to measure and predict the behaviour of the human body), voltmeters (used to measure electric voltage), and so on. Essentially, any measurable physical quantity can be processed and reduced into an analogue model using an analogue computer.
How Are Analogue Computers Different from Digital Computers?
Digital computers work on discrete input and produce discrete output. This is a fancy way of saying that they take in symbols and produce symbols (such as 1s and 0s) as output. Analogue computers take in continuous information as input and produce continuous output (usually over time). Think about the speedometer. It takes the motion of a rotating object as an input and produces a moving dial as the output.
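To make the contrast concrete, here is a minimal sketch in Python (with invented numbers, not real speedometer data) of what "discrete" means in practice: a digital speedometer must sample the continuous wheel signal at fixed intervals and round each sample to a finite set of values, whereas the analogue dial simply follows the signal itself.

```python
import math

def wheel_speed(t):
    """A continuous, made-up wheel-speed signal in km/h (the analogue input)."""
    return 60 + 20 * math.sin(t)

# An analogue speedometer tracks wheel_speed(t) continuously: the dial
# position at any instant *is* the value of the signal.

# A digital speedometer must sample and quantise the same signal:
SAMPLE_INTERVAL = 0.5   # seconds between samples (discrete time)
RESOLUTION = 1          # km/h per displayed step (discrete values)

digital_readings = [
    round(wheel_speed(n * SAMPLE_INTERVAL) / RESOLUTION) * RESOLUTION
    for n in range(10)
]
print(digital_readings)  # a finite list of symbols, not a continuous quantity
```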
Digital computers have a fixed architecture that runs variable algorithms called programs to solve problems. This makes digital computers great at solving general problems: we can use the same architecture to solve very different ones. Think about your smartphone. You can use the same device to stream videos as well as perform mathematical calculations (among other things). Analogue computers, on the other hand, are not general purpose. They are very good at solving one particular problem. Think about the speedometer: it cannot be used for anything other than measuring speed. The advantage of analogue computers, however, is that they are very fast at what they do while being resource-efficient. One of their big drawbacks is that, by the nature of their construction, they are not perfect. Such imperfections lead to output inconsistencies, meaning the same input may lead to slightly different outputs; all it takes is a loose wire or a mechanical gear chipped by a few millimetres. The toy simulation below illustrates the point.
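As a sketch of that last point (with an invented tolerance figure, not a model of any real device), we can add a small random error to an otherwise fixed computation and watch the same input produce slightly different outputs on every run:

```python
import random

def analogue_double(x, tolerance=0.01):
    """Doubles x the way an imperfect physical mechanism might:
    each run picks up a small random error from component tolerances."""
    return 2 * x * (1 + random.uniform(-tolerance, tolerance))

# The same input gives slightly different outputs on every run:
print([round(analogue_double(5.0), 3) for _ in range(3)])
# e.g. [10.042, 9.967, 10.013] -- a digital doubling would print 10.0 every time
```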
Why Are Digital Computers Not the Future?
If digital computers are more general-purpose and more precise than analogue computers, why are they not the future? The short answer is that we are fast approaching the limits of digital computing. What started as arguably the most defining master's thesis of the past century, from the genius Claude Shannon, has taken us really far. But the ride is coming to a slow end. Shannon essentially invented the idea of applying Boolean algebra to process information, which paved the way for the mathematical treatment of general information (digitally). Eventually, as technology developed, we were able to build general-purpose digital architectures using transistors packed into integrated circuits.
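To see what "applying Boolean algebra to process information" means in practice, here is a minimal sketch of a one-bit half adder, the kind of building block that transistor circuits realise physically (Python is used here purely as notation for the logic):

```python
def half_adder(a, b):
    """Adds two single bits using only Boolean operations,
    exactly the kind of logic a transistor circuit implements in hardware."""
    sum_bit = a ^ b    # XOR gate: the sum without carry
    carry = a & b      # AND gate: the carry bit
    return sum_bit, carry

# All of digital arithmetic can be built up from gates like these:
for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "=", half_adder(a, b))  # (sum, carry)
```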
Over the years, we kept developing smaller and smaller transistors, which led to faster and faster digital computers. This phenomenon was captured by Moore's Law, which states that the number of transistors in a dense integrated circuit doubles about every two years. The odd thing about Moore's Law is that it is no law at all; it is merely an empirical observation that has been extrapolated into the future. Our latest transistor architectures are so dense that their features are approaching the size of atoms. Beyond this point, we face physical limits on building denser circuits (how do we pack things closer than atoms can be packed?). In short, the digital computers of the future may not get much faster than today's. But why is faster computing important? That question leads us directly to our core topic.
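The doubling itself is simple compound growth. As a rough back-of-the-envelope sketch (the starting count is illustrative, not historical data):

```python
# Moore's Law as an empirical growth rule: the transistor count doubles
# roughly every two years.
DOUBLING_PERIOD = 2  # years

def transistors(start_count, years):
    """Extrapolated transistor count after `years` of Moore's-Law growth."""
    return start_count * 2 ** (years / DOUBLING_PERIOD)

# Illustrative numbers only: starting from 1 million transistors, twenty
# years of doubling every two years gives a thousand-fold increase.
print(f"{transistors(1_000_000, 20):,.0f}")  # 1,024,000,000
```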
Why Are Analogue Computers Considered the Future?
Some of the most challenging real-world phenomena we are trying to understand and predict involve differential equations. Consider fluid flow, for instance. It is described by a set of equations known as the Navier-Stokes equations, which have no known general analytical solution and must be solved numerically using computers. Similar (partial) differential equations describe most of the real-world phenomena we currently try to understand and predict, such as tides and weather. When we combine this requirement with the staggering growth of machine learning methods, we are in dire need of faster and better computing. And when it comes to solving specific differential equations, analogue computers are still unmatched by digital computers.
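To see where the digital cost comes from, consider how a digital computer solves even the simplest differential equation: it must march forward in small discrete time steps. Below is a minimal Euler-method sketch for the toy equation dx/dt = -k*x (deliberately far simpler than Navier-Stokes); an analogue computer would instead wire up an integrator whose physics performs the same integration continuously, in effect all at once.

```python
import math

# Toy differential equation: dx/dt = -k * x, with known exact solution
# x(t) = x0 * exp(-k * t). A digital computer approximates it step by step.
k = 0.5       # decay constant (illustrative)
x = 1.0       # initial condition x(0)
dt = 0.001    # time step: smaller steps mean more accuracy but more work
steps = 2000  # integrate up to t = 2.0

for _ in range(steps):
    x += dt * (-k * x)  # one Euler step; thousands of these multiply-adds

print(f"numerical x(2) = {x:.5f}")             # ~0.36779
print(f"exact     x(2) = {math.exp(-1):.5f}")  # 0.36788
```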
Ever since Claude Shannon wrote his master's thesis, almost the entire human race has been on the digital computing bandwagon. Now that we are facing the limits of digital computing, our attention is slowly moving back towards the analogue computing we abandoned around the 1970s. The question then becomes: could we come up with viable general-purpose analogue computers, where the architecture can be rebuilt on the fly to solve different problems as and when required (remember: with analogue, the computer is the program)? Or is a hybrid approach that combines digital and analogue computing an option as well? Scientific researchers and tech start-ups are on the case, and analogue computing is generally considered to have significant untapped potential. This is why analogue computers are suddenly on the rise. But only time will truly tell where we are headed.
I hope you found this article interesting and useful. If you'd like to get notified when interesting content gets published here, consider subscribing.
Further reading that might interest you: Is It Time For Us To Reimagine Regular Education? and How Many Decimal Digits Of Pi Do We Really Need?