Last week we touched on a few of the laws governing algorithm performance on computers. Those laws dealt largely with the nature of computers, how they work and communicate with each other, and the impact that this interaction has on the performance of software. This week we are going to look at a few of the laws that govern what is possible in the development of computer hardware itself.
The most well-known of these laws is Moore’s Law, which has arguably reached the end of its usefulness. In 1965, Gordon Moore, who went on to co-found Intel and serve as its CEO, observed that the density of transistors on an integrated circuit doubled about every two years. There are several arguments that its usefulness with regard to computational power ended in 2001, but Intel, as a leader in the processor industry, maintains that it is still in effect and still improving computing hardware today. For a couple of reasons, I am in the group that believes Moore’s Law died in 2001. The first is that in 2001 we produced the fastest single-core processor ever created, and we have not gotten any faster since; in fact, single-core speeds today are about half of what they were in 2001. The second is that starting in 2001, the speed benefit of smaller transistors went away; as transistors got smaller they also got slower, so Intel, along with the other processor manufacturers, started putting more cores on a single chip. We now have processors with 40 or more cores on a chip, but each core runs at roughly half the speed of the single-core chips of the 1990s. It is now widely agreed that Moore’s Law no longer applies, since transistors are approaching the size of a single atom and cannot get much smaller, so Moore’s Law is in effect dead, but still worth mentioning.
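To make the two-year doubling concrete, here is a minimal Python sketch. The starting point of 2,300 transistors in 1971 (the Intel 4004) is just an illustrative baseline, and the function name is my own:

def moores_law(base_count, base_year, target_year, doubling_period=2.0):
    # Transistor count doubles once per doubling_period years.
    doublings = (target_year - base_year) / doubling_period
    return base_count * 2 ** doublings

# Illustrative baseline: the Intel 4004 (1971) had roughly 2,300 transistors.
for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, round(moores_law(2300, 1971, year)))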
Koomey’s Law is very similar to Moore’s Law but relates to the energy consumption of a processor. Jonathan Koomey observed that as transistors got smaller they also became more energy efficient, with the amount of computation possible per unit of energy doubling about every 1.5 years. That efficiency gain has fallen off as Moore’s Law slowed, and it now doubles only about every 2.6 years.
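To see what that slowdown means in practice, here is a small sketch comparing the two doubling rates over a decade (the numbers are purely illustrative):

def efficiency_gain(years, doubling_period):
    # How many times more work per unit of energy after the given number of years.
    return 2 ** (years / doubling_period)

print("1.5-year doubling over 10 years:", round(efficiency_gain(10, 1.5)), "x")  # about 100x
print("2.6-year doubling over 10 years:", round(efficiency_gain(10, 2.6)), "x")  # about 14x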
Dennard’s Law, usually referred to as Dennard scaling, was an observation made in 1974 by Robert H. Dennard and his colleagues. He was attempting to explain how processor manufacturers were able to increase the clock frequency, and thus the speed of the processor, without significantly increasing the power consumption. Dennard essentially discovered that, “As transistors get smaller, their power density stays constant, so that the power use stays in proportion with area; both voltage and current scale downward with the length of the transistor.” Dennard scaling broke down in the 2005-2006 era because it ignored some key factors in the overall behavior of the transistor. These factors include the leakage current, which is the current lost across the gate of the transistor, and the threshold voltage, which is the minimum voltage necessary to open the transistor gate. Together they establish a minimum power consumption per transistor, regardless of its size.
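That quoted statement can be made concrete with the usual dynamic-power approximation P ≈ C · V² · f. The sketch below assumes ideal scaling by a factor k and, importantly, ignores leakage, which is exactly the simplification that eventually broke down:

def scaled_power_density(k):
    # Ideal Dennard scaling: shrink linear dimensions and supply voltage by a factor k.
    C = 1.0 / k             # capacitance shrinks with feature size
    V = 1.0 / k             # supply voltage scales down with transistor length
    f = k                   # clock frequency can scale up by the same factor
    area = (1.0 / k) ** 2   # transistor area shrinks quadratically
    power = C * V ** 2 * f  # dynamic power per transistor (leakage ignored)
    return power / area     # power per unit of chip area

for k in (1, 2, 4):
    print("scale factor", k, "-> power density", scaled_power_density(k))

The power density comes out to 1.0 in every case: smaller, faster transistors draw no more power per unit area, at least until leakage current spoils the picture.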
Rock’s Law is an economic observation related to the cost of processor manufacturing, and it is probably one of the main reasons Moore’s Law scaled over years rather than months. Arthur Rock, an investor in many early processor companies, observed that the cost of a semiconductor fabrication plant doubles every four years. Rock’s Law is also known as Moore’s second law and is the economic flip side of Moore’s Law.
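Here is the same kind of back-of-the-envelope projection applied to fab costs; the $1 billion figure for the year 2000 is only an assumed starting point for illustration:

def fab_cost(base_cost, base_year, target_year, doubling_period=4.0):
    # Fabrication plant cost doubles once per doubling_period years.
    return base_cost * 2 ** ((target_year - base_year) / doubling_period)

# Assumed baseline: a $1 billion fab in 2000.
for year in (2000, 2008, 2016, 2024):
    print(year, "about $%.0f billion" % (fab_cost(1e9, 2000, year) / 1e9))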
The last law I will talk about relating to computer design is Bell’s Law. In 1972, Gordon Bell observed that over time, low-cost, general-purpose computing architectures evolve, become mainstream, and eventually die out. You can see this with your cell phone; most people replace their cell phones every two years. However, his law was dealing with larger-scale systems. For example, roughly every decade a new class of computers creates new uses and establishes a new industry: mainframes in the 1960s, minicomputers in the 1970s, personal computers in the 1980s, the worldwide web in the 1990s, cloud computing in the 2000s, and handheld devices and wireless networks in the 2010s. Predictions indicate that the 2020s will be the decade of quantum computing.