Scalability


The performance of a system depends upon the size of the system, i.e., the number of processors; generally, the larger the system, the better the performance, but this comes at a cost. Scalability is a rather imprecise term. It is used to indicate that a hardware design allows the system to be increased in size and, in doing so, to obtain increased performance. This could be described as architecture or hardware scalability. Scalability is also used to indicate that a parallel algorithm can accommodate increased data items with a low and bounded increase in the number of computational steps. This could be described as algorithm scalability.

Of course, we would want all multiprocessor systems to be architecturally scalable, but this depends heavily upon the design of the system. Usually, as we add processors, the interconnection network must be expanded, which results in greater communication delays and increased contention and reduces the system efficiency. The underlying goal of most multiprocessor designs is to achieve scalability, and this is reflected in the multitude of interconnection networks that have been devised.

Combined architecture/algorithmic scalability suggests that increased problem size can be accommodated with increased system size for a particular architecture and algorithm. Whereas increasing the size of the system clearly means adding processors, increasing the size of the problem requires clarification. Intuitively, we would take the number of data elements being processed by the algorithm as a measure of size. However, doubling the problem size would not necessarily double the number of computational steps; it depends upon the problem. For example, doubling the number of elements in a matrix addition doubles the number of computational steps, but doubling the data in a matrix multiplication increases the number of steps by a larger factor, since the conventional algorithm requires on the order of n^3 steps for n x n matrices, which grows faster than the n^2 data elements. Hence, scaling different problems implies different computational requirements. An alternative definition of problem size is to equate it with the number of basic steps in the best sequential algorithm. Of course, even with this definition, increasing the number of data points increases the problem size.
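As a rough illustration, here is a minimal Python sketch (assuming dense n x n matrices and counting one step per scalar addition or per scalar multiply-add) that compares how the step counts grow when the matrix dimension is doubled. The exact growth factor depends on whether "problem size" is taken to be the dimension n or the number of elements n^2, which is precisely why the definition needs clarification.

    def addition_steps(n):
        # Element-wise addition of two n x n matrices: one addition per element.
        return n * n

    def multiplication_steps(n):
        # Conventional multiplication of two n x n matrices:
        # one multiply-add for every (row, column, inner-index) triple.
        return n ** 3

    for n in (100, 200):
        print(n, addition_steps(n), multiplication_steps(n))

    # Doubling the dimension n multiplies the addition count by 4 and the
    # multiplication count by 8; doubling the number of elements (n^2)
    # instead multiplies them by 2 and roughly 2.8, respectively.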

Gustafson presented an argument based upon scalability concepts to show that Amdahl's Law was not as significant as first supposed in determining the potential speedup limits. Gustafson attributed the formulation of the idea into an equation to E. Barsis. Gustafson makes the observation that in practice a larger multiprocessor usually allows a larger problem to be undertaken in a reasonable execution time. Hence, in practice, the problem size selected frequently depends on the number of available processors. Rather than assume that the problem size is fixed, it is just as valid to assume that the parallel execution time is fixed: as the system size is increased, the problem size is increased to maintain a constant parallel execution time. In increasing the problem size, Gustafson also makes the case that the serial section of the code is normally fixed and does not increase with the problem size.
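In equation form (the usual statement of Gustafson's scaled speedup, taking s to be the fraction of the fixed parallel execution time spent in the serial section and n to be the number of processors):

    S(n) = s + (1 - s)n = n + (1 - n)s

For example, with s = 0.05 and n = 100 processors the scaled speedup is 0.05 + 0.95 x 100 = 95.05, whereas Amdahl's Law for a fixed-size problem with a serial fraction of 0.05 limits the speedup to 1/0.05 = 20, however many processors are used.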

A Driving Metaphor (between Amdahl and Gustafson)

Amdahl’s Law approximately suggests:

Suppose a car is traveling between two cities 60 miles apart and has already spent one hour traveling half the distance at 30 mph. No matter how fast it drives the last half, it is impossible to achieve an average of 90 mph before reaching the second city: the trip has already taken one hour and the total distance is only 60 miles, so even driving infinitely fast would yield an average of only 60 mph.
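Worked out with the numbers in the example: after one hour the car has covered 30 of the 60 miles. If the remaining 30 miles are driven at speed v, the average speed is 60 / (1 + 30/v) mph, which approaches but never exceeds 60 mph as v grows, let alone 90 mph.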

Gustafson’s Law approximately suggests:

Suppose a car has already been traveling for some time at less than 90 mph. Given enough time and distance to travel, the car's average speed can always eventually reach 90 mph, no matter how long or how slowly it has already traveled. For example, if the car spent one hour at 30 mph, it could achieve this by driving at 120 mph for two additional hours, or at 150 mph for one additional hour, and so on.
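Worked out: after one hour at 30 mph the car has covered 30 miles. Driving a further t hours at speed v gives an average of (30 + vt) / (1 + t) mph, which can be made equal to 90 mph by taking t and v large enough. With v = 120 mph and t = 2, the average is (30 + 240)/3 = 90 mph; with v = 150 mph and t = 1, it is (30 + 150)/2 = 90 mph.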

References:

Wikipedia

Barry Wilkinson, Parallel Programming – Techniques and Applications, 2nd Edition