Thursday, July 25, 2019

Computing Laws Part 1


Over my many years in the computing industry, I have been introduced to several “laws” of computing that seem to keep computing from ever reaching its maximum potential. Unless you have a degree in computer science or have taken at least some introductory computing courses, you have probably never heard of any of these “laws” of computing.
First, I would like to say that most of them are not really laws, like the law of gravity for instance, but more like observations of phenomena that seem to guide the computing industry. Most of them relate primarily to computing hardware trends, parallel software performance, and computing system efficiency. Over the next couple of weeks I will talk about several of these laws, and in the final week I will introduce a law of my own.
The first of these laws is Amdahl’s Law. Gene Amdahl observed a phenomenon in parallel computing that allowed him to predict the theoretical speedup of an application when using multiple processors. He summarized the law like this: “The speedup of a program using multiple processors is limited by the time needed for the sequential fraction of the program.” In other words, every computer program, just like every task you complete in a given day, has a serial part that can only be done in a particular order and cannot be split up. A great example is mowing a lawn: you have to fill the lawn mower with gas and start it before you can mow. If you have a bunch of friends with a bunch of lawn mowers, you still cannot get any faster than the time it takes to fill the mowers with gas. The mowing can be done in parallel up to the number of friends with mowers, but you still have the limiting factor of the sequential step. The consequence of Amdahl’s Law is that even with an unlimited number of processors, the speedup of a program is capped by its serial fraction.
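To put a number on that ceiling, here is a minimal Python sketch of Amdahl’s formula, speedup = 1 / (s + (1 − s)/N), where s is the serial fraction and N is the number of processors. The 5% serial fraction below is an assumed example value, not a measurement from any particular program.

    def amdahl_speedup(n_processors, serial_fraction):
        """Theoretical speedup with n processors when a fixed
        fraction of the work must run serially (Amdahl's Law)."""
        parallel_fraction = 1.0 - serial_fraction
        return 1.0 / (serial_fraction + parallel_fraction / n_processors)

    # A program that is 5% serial (an assumed value) can never exceed
    # 1 / 0.05 = 20x, no matter how many processors you add.
    for n in (1, 4, 16, 64, 1024):
        print(n, round(amdahl_speedup(n, 0.05), 2))

Even at 1,024 processors the speedup is only about 19.6x, already pressing against the 20x ceiling set by the serial step.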
A law related to Amdahl’s Law is Gunther’s Law, also known as the Universal Scalability Law. In 1993 Neil Gunther presented a conceptual framework for understanding and evaluating the maximum scalability of a computing system. He understood that for any given task, even one that can be done entirely in parallel, like many friends helping mow your lawn, you still reach a point at which adding more friends can actually cause mowing your lawn to take longer. The same thing happens in computer algorithms. Each has a sweet spot where the number of running processes balances out with the size of the problem and the hardware available to give the best performance for the program. I have always liked to think of Gunther’s Law as the law of too many friends: if your friends and their mowers cover your entire lawn, they get in each other’s way and your lawn never gets mowed.
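Gunther’s formula makes the too-many-friends effect visible. The sketch below is a minimal Python version of the Universal Scalability Law, C(N) = N / (1 + σ(N − 1) + κN(N − 1)); the contention (sigma) and coherency (kappa) coefficients are assumed example values chosen just to show the peak.

    def usl_throughput(n, sigma, kappa):
        """Relative throughput with n processors under the Universal
        Scalability Law: sigma models contention (waiting on a shared
        resource), kappa models coherency cost (everyone getting in
        each other's way)."""
        return n / (1.0 + sigma * (n - 1) + kappa * n * (n - 1))

    # With even a small coherency cost, throughput peaks and then
    # declines as processors are added (coefficients are assumptions).
    for n in (1, 8, 32, 64, 128):
        print(n, round(usl_throughput(n, 0.05, 0.001), 2))

With these coefficients, throughput peaks around 32 processors and actually drops at 64 and 128: the computing equivalent of the lawn never getting mowed.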
In 1988 John L. Gustafson and Edwin Barsis wrote an article, “Reevaluating Amdahl’s Law,” in which they showed that you can retain scalability by increasing the problem size. In other words, you can overcome the impact of the serial part of a problem by making the problem sufficiently large, minimizing the impact of the serial component, and along the same lines also improve upon the maximum performance predicted by Gunther’s Law. If you have many friends with many lawn mowers and a small lawn, you are limited to scaling to only a small number of friends, but if you increase the size of the lawn, the serial step of filling the mowers with gas and starting them becomes insignificant compared to the task of mowing. You are also able to give each of your friends a significant amount of lawn to mow, avoiding the problems caused by having too many mowers.
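Gustafson’s scaled speedup can be sketched the same way: S(N) = s + (1 − s)N, where the serial fraction s is measured on the parallel system and the problem grows with the machine. Again, the 5% serial fraction is an assumed example value.

    def gustafson_speedup(n_processors, serial_fraction):
        """Scaled speedup (Gustafson's Law): the problem grows with
        the machine, so the parallel part expands while the serial
        part stays fixed in time."""
        return serial_fraction + (1.0 - serial_fraction) * n_processors

    # The same 5% serial fraction that capped Amdahl's speedup at 20x
    # now yields nearly linear scaling, because the lawn grows along
    # with the number of mowers (serial fraction is an assumption).
    for n in (1, 4, 16, 64, 1024):
        print(n, round(gustafson_speedup(n, 0.05), 2))

At 1,024 processors the scaled speedup is about 973x, versus the 20x ceiling Amdahl’s Law predicts for a fixed-size problem.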
