**Moore’s Law** was first proposed in a magazine article by Intel co-founder Gordon E. Moore in 1965.

Moore's Law states that the number of transistors that can be packed into a given unit of space will
roughly double every two years. Indeed, since the 1970s, Intel has released chips that conform to this "Law."

- An important issue with Moore's Law is that it is sometimes conflated with processor speed, as in "when the transistor size halves,
the processor speed doubles." In fact, since about 2000 the increase in processor speed has slowed down dramatically.

- Moreover, it appears that we may be reaching the
end of Moore's Law.

- Finally, even if all were well with Moore's Law, consider a computation that must run 10 times faster than is possible on
a present single-processor system. With a doubling every two years, a 10x gain takes about log2(10) × 2 ≈ 6.6 doubling-years, so we would wait 6-8 years before the computation could be done in a useful way. If we could somehow divide the work among 10 systems,
we could accomplish the computation today.
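The waiting-time arithmetic in the last bullet can be checked directly. A minimal sketch (the function name and the two-year doubling constant are illustrative assumptions, not from the lecture):

```python
import math

DOUBLING_PERIOD_YEARS = 2  # Moore's Law: transistor count doubles roughly every 2 years

def years_until_speedup(target: float) -> float:
    """Years of waiting until single-system performance grows by `target`x,
    assuming performance tracks the two-year doubling of Moore's Law."""
    doublings = math.log2(target)          # how many doublings we need
    return doublings * DOUBLING_PERIOD_YEARS

print(years_until_speedup(10))  # about 6.64 years, matching the "6-8 years" above
```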

**Amdahl’s Law** was named after Gene Amdahl, who presented it in 1967.

It gives the speedup S(N) achievable on N processors, where P is the fraction of the program that can be parallelized:

S(N) = 1 / ((1 − P) + P/N)

Speedup is limited by the total time needed for the sequential (serial) part of the program. For 10 hours of computing, if we can parallelize 9 hours and 1 hour cannot be parallelized, then our maximum speedup is limited to 10 times as fast: as N grows, S(N) approaches 1/(1 − P) = 1/0.1 = 10. Adding more processors cannot push the speedup past this limit; it stays the same no matter how large N gets.
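The formula and the 10-hour example above can be sketched as a short computation (the function name is illustrative):

```python
def amdahl_speedup(p: float, n: int) -> float:
    """S(N) = 1 / ((1 - P) + P/N): speedup on N processors when a
    fraction P of the work can be parallelized."""
    return 1.0 / ((1.0 - p) + p / n)

# 9 of 10 hours parallelizable -> P = 0.9
for n in (1, 10, 100, 1000):
    print(n, round(amdahl_speedup(0.9, n), 2))
# The speedup approaches, but never exceeds, 1/(1 - P) = 10.
```

Note that even with 1000 processors the speedup is only about 9.9x: the 1-hour serial part dominates once the parallel part has been spread thin.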

Designing and Building Applications for Extreme Scale Systems - Lecture 14

### Carbon Nanotubes

### Quantum Computing