In my previous essay on how to benefit from computer science, I covered the concepts of parallelism, coordination overhead, pipelining, latency, and throughput. In this essay, I will cover the concept of the time-space-money trade-off (also known as the resource trade-off).
We will start by covering the basics of why this class of problems is relevant for the world of computer science. Following this, we will transition to real-world examples of this class of problems in action. Finally, we will see how we could solve real-world problems using inspirational solutions from the computer science world. Let us begin.
The Time-Space-Money Trade-off Problem in Computer Science
When it comes to computers, ‘time’ refers to how fast a certain problem can be solved, and ‘space’ refers to how much memory/storage is required to solve the said problem. Let us dive deeper into ‘time’ and ‘space’ for now; we will eventually get to ‘money’.
As I mentioned in my previous essay, a typical computer processor can process information several orders of magnitude faster than physical storage can supply it. If this mismatch is not handled properly, the processor could end up waiting idly while data is sent to or retrieved from storage.
Therefore, computer processors are designed to do meaningful work while they wait for information to be sent to or retrieved from storage. The processors themselves have very little on-board memory. So, computer architects constantly have to decide between working with the small, fast memory close to the processor and reading from or writing to the large, slow physical storage device.
When physical storage is used, larger-scale problems become solvable, but solving them costs more time. This is what we mean by the time-space trade-off.
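If you enjoy code, here is a minimal Python sketch of this trade-off (my own illustrative example, not tied to any particular system). Computing a Fibonacci number purely by recomputation uses almost no memory but a lot of time, whereas spending memory on a table of already-computed results makes the same computation almost instantaneous:

```python
from functools import lru_cache

# Pure recomputation: almost no memory is used, but the same
# subproblems are recomputed over and over, so it is very slow.
def fib_slow(n: int) -> int:
    if n < 2:
        return n
    return fib_slow(n - 1) + fib_slow(n - 2)

# Spending memory on a table of already-computed results makes
# the same computation fast: space is traded for time.
@lru_cache(maxsize=None)
def fib_fast(n: int) -> int:
    if n < 2:
        return n
    return fib_fast(n - 1) + fib_fast(n - 2)

print(fib_fast(80))   # returns instantly; fib_slow(80) would take ages
```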
How about money? Well, that part is easier to understand with the help of an example. Consider an IoT (Internet of Things) device such as a smart alarm clock. Such a device requires a fast processor but not much memory. On the other hand, consider an offshore oil rig or an intergalactic satellite. Such devices require lots of memory but typically feature a slow processor, because slower processors consume less power, and lower power consumption means more economic efficiency (cost/money).
Now that we have covered the concept of time-space-money trade-off in the context of computer science, let us jump into the real world.
The Time-Space-Money Trade-off in the Real World
Let us start with the illustration from the title image. Imagine that you need to buy groceries and have two options: the ‘Elito’ supermarket that is just 200 metres away from where you are, and the ‘Elcheapo’ supermarket that is 5 kilometres away.
As the names suggest, groceries at ‘Elcheapo’ are significantly cheaper than at ‘Elito’, but you save significantly more time if you opt for ‘Elito’. This is a classic real-world example of the time-money trade-off.
Now, consider another situation where you are throwing a party for your birthday. You have invited 20 people. You don’t know for sure if 10 cases of assorted drinks would suffice, so you choose to err on the safe side and order 20 cases. At the end of the party, it turns out that 10 cases were indeed enough and you are left with 10 extra cases in your storage room/garage.
This is a classic real-world example of the time-space trade-off. Had you ordered just 10 cases and run out of drinks in the middle of the party, you would have had to drive to the store again, which costs valuable party time.
Having covered examples of real-world scenarios where the time-space-money trade-off pops up, let us now see how we could find inspiration from the world of computer science to solve such problems.
How to Benefit from Computer Science in Real Life — Time-Space-Money Trade-off
In the computer world, the time-space-money trade-off problem is addressed using a technique known as caching. With this technique, a small amount of problem-relevant data is temporarily moved very close to the processor (into smaller but faster physical storage).
Because of its close proximity to the processor (compared to the larger main storage) and its faster architecture, cache memory enables a good compromise in time-space-money trade-off problems. Here is an illustration of what such a cache architecture might look like (with two levels of cache hierarchy):
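To make the idea concrete in code, here is a toy Python model of such a two-level cache hierarchy. The capacities and access costs below are made-up numbers purely for illustration; real caches are managed in hardware with far more sophisticated replacement policies:

```python
# A toy model of a two-level cache hierarchy. The capacities and
# access costs are made-up numbers purely for illustration.
L1 = {}   # tiny and fastest, closest to the processor
L2 = {}   # larger but slower
MAIN_STORAGE = {f"item{i}": f"value{i}" for i in range(1000)}  # huge and slowest

L1_CAPACITY, L2_CAPACITY = 4, 16

def read(key):
    """Return (value, cost), where cost is a pretend access time."""
    if key in L1:                      # L1 hit: ~1 time unit
        return L1[key], 1
    if key in L2:                      # L2 hit: ~10 time units
        value, cost = L2[key], 10
    else:                              # miss: fetch from main storage, ~100 time units
        value, cost = MAIN_STORAGE[key], 100
        if len(L2) >= L2_CAPACITY:
            L2.pop(next(iter(L2)))     # evict the oldest L2 entry (simple FIFO)
        L2[key] = value
    if len(L1) >= L1_CAPACITY:
        L1.pop(next(iter(L1)))         # evict the oldest L1 entry
    L1[key] = value                    # promote the item closer to the 'processor'
    return value, cost

print(read("item42"))  # ('value42', 100) -- missed everywhere, fetched from main storage
print(read("item42"))  # ('value42', 1)   -- now served from L1
```

Repeated reads of the same item become dramatically cheaper, because the item is now served from the closest, fastest level.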
So, how can we apply the computer world's caching solution to real-world problems? In one sentence:
Keep more important stuff closer, and less important stuff farther away.
This sounds like such an obvious thing to say, but let me tell you that the rabbit hole goes deeper. Just take a look at your work desk or kitchen counter. The sheer number of unnecessary things that most of us leave stacked in these places out of laziness really eats into our time-space-money trade-off decisions in real life.
In the supermarket scenario, it makes sense to get the most essential groceries from ‘Elito’ on short notice, and to visit ‘Elcheapo’ only once in a while, but to load up on most of your grocery requirements there.
In the birthday party scenario, one efficient solution could be to purchase just 10 cases initially and, when you are on your 8th case midway through the party, order more using a faster but more expensive delivery service. This way, you trade money for time, and you don’t have to worry about storing extra cases at your place.
Having seen a couple of examples of how the caching technique could be leveraged to solve real-world problems, let us explore one last example of how tech companies take advantage of this.
Computer Science Meets the Real-World
Over the past few years, streaming services have become more and more popular. Let us say that you have subscribed to one such music streaming service. This service uses an optimized algorithm that tailors the songs on your recommendation list to your music taste.
Did you know that such a streaming service also faces time-space-money trade-off decisions? If you stream every song you listen to every single time, it eats into your mobile data usage and hits the music servers hard as well. On the other hand, if the streaming service downloads every single song you listen to onto your phone’s storage, your phone would run out of space in no time.
So, how does such a streaming service solve this problem? Well, it uses a caching technique to store temporary copies of your most frequently played songs in your phone’s memory. The developers could design the algorithm in such a way that their servers are not hit hard, yet your data usage stays ‘just high enough’ for them to offer you ‘download music’ as a paid option. So, now you know!
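How a real streaming app decides exactly what to keep on the phone is not public. Still, here is a hypothetical Python sketch of the general idea of caching your most frequently played songs; the class name, capacity, and eviction rule are assumptions made purely for illustration:

```python
from collections import Counter

# A toy 'keep the most frequently played songs on the phone' cache.
# The class name, capacity, and eviction rule are illustrative
# assumptions; real streaming apps use far more elaborate policies.
class SongCache:
    def __init__(self, capacity=3):
        self.capacity = capacity
        self.stored = {}              # song -> audio data kept on the phone
        self.play_counts = Counter()  # how often each song has been played

    def play(self, song):
        self.play_counts[song] += 1
        if song in self.stored:
            return f"playing '{song}' from phone storage (no mobile data used)"
        audio = f"<audio for {song}>"  # pretend download from the server
        if len(self.stored) >= self.capacity:
            # Evict whichever stored song has been played least often.
            least_played = min(self.stored, key=lambda s: self.play_counts[s])
            del self.stored[least_played]
        self.stored[song] = audio
        return f"streamed '{song}' from the server (mobile data used)"

cache = SongCache(capacity=2)
for song in ["Song A", "Song A", "Song B", "Song A", "Song C", "Song A"]:
    print(cache.play(song))
```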
Final Remarks
In conclusion, both computers and human beings face the challenge of deciding between a quick but costly option (in terms of space and/or money) and a slow but cheaper option. Computer science tackles this problem using a technique known as caching.
Caching provides high-speed access to data stored physically close to the processor. Similarly, human beings can optimize by keeping the more important things closer and the less important things farther away.
You can use this approach to design your workspace/kitchen space, choose a location for your new home, develop a smartphone app, etc. The list of real-world applications and potential benefits is limitless!
If you’d like to get notified when interesting content gets published here, consider subscribing.
Further reading that might interest you: Why Are Analogue Computers Really On The Rise Again? and How To Benefit From Computer Science In Real Life (I)?
If you would like to support me as an author, consider contributing on Patreon.