Moore's Law

Moore's Law was first proposed by Intel co-founder Gordon E. Moore in a 1965 magazine article.

Moore's Law says that the number of transistors that can be packed into a given unit of space will roughly double every two years. And indeed, since the 1970s Intel has released chips that conform to this "Law."
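
Put as a formula (a rough sketch; the clean two-year doubling period is the commonly quoted figure, and N0 is just a hypothetical baseline transistor count), the expected count t years after the baseline is:

                N(t) = N0 * 2^(t/2)

For example, starting from 10,000 transistors, a decade later we would expect roughly 10,000 * 2^5 = 320,000.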

However:

  1. An important issue with Moore's Law is that it is sometimes conflated with processor speed, as in: "when the transistor size halves, the processor speed doubles." In fact, since about 2000 the increase in processor speed has slowed down dramatically.
     
  2. Moreover, it appears that we may be reaching the end of Moore's Law.
     
  3. Finally, even if all were well with Moore's Law, a computation that had to run 10 times faster than a present single-processor system allows would have to wait roughly 6-8 years before it could be done in a useful way (a 10x gain takes about 3.3 doublings at two years each; see the short calculation after this list). If we could somehow divide the work among 10 systems, we could accomplish the computation today.

(Source: Designing and Building Applications for Extreme Scale Systems, Lecture 15)
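
To make the 6-8 year figure in point 3 concrete, here is a back-of-the-envelope sketch, assuming a clean two-year doubling period:

                10 ≈ 2^3.32, so a 10x speedup takes about 3.32 doublings
                3.32 doublings × 2 years per doubling ≈ 6.6 years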

which leads to:


Amdahl's Law

Amdahl’s Law was named after Gene Amdahl, who presented it in 1967.

In general terms, Amdahl's Law states that if P is the proportion of a system or program that can be made parallel, and 1-P is the proportion that remains serial, then the maximum speedup S(N) that can be achieved using N processors is:

                S(N)=1/((1-P)+(P/N))

As N grows, the speedup tends to 1/(1-P): the serial fraction sets a hard ceiling.
 
Speedup is limited by the total time needed for the sequential (serial) part of the program. For 10 hours of computing, if 9 hours can be parallelized and 1 hour cannot, then our maximum speedup is limited to 10 times as fast (P = 0.9, so 1/(1-P) = 10). Faster processors shrink the serial and parallel parts proportionally, so the maximum speedup itself stays the same.
 
(Source: Designing and Building Applications for Extreme Scale Systems, Lecture 14)
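
As a minimal sketch of the law in code (Python; the function name amdahl_speedup and the sample values are illustrative, not from the lecture):

    def amdahl_speedup(p, n):
        """Maximum speedup for parallel fraction p on n processors."""
        return 1.0 / ((1.0 - p) + p / n)

    # The 10-hour example above: 9 of 10 hours parallelizable, so p = 0.9.
    for n in (10, 100, 1000, 10**6):
        print(n, round(amdahl_speedup(0.9, n), 2))
    # Prints 5.26, 9.17, 9.91, 10.0 -- approaching 1/(1-0.9) = 10, never more.

    # A larger P raises the ceiling: p = 0.99 allows up to 100x.
    print(round(amdahl_speedup(0.99, 10**6), 2))

This makes the closing task concrete: pushing P from 0.9 to 0.99 raises the speedup ceiling from 10x to 100x.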

The Task: Find algorithms with larger P.


Down The Road