Some current researchers argue that Moore's Law may be more limited than many have thought. The law states that the performance of digital electronics improves exponentially. The best-known formulation is that the number of transistors that can be crammed onto a chip doubles roughly every 18 months (Gordon Moore's original 1965 estimate was a doubling every year; he revised it to every two years in 1975). A popular alternative formulation is that the unit price of computing halves every 18 months.
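To make the compounding concrete, here is a minimal illustrative sketch (in Python; the 18-month period and the unit baseline are assumptions for illustration, not data) of what such a doubling implies over a few decades:

```python
# Illustrative only: the compound growth implied by an 18-month doubling period.
DOUBLING_PERIOD_YEARS = 1.5  # assumed period, per the popular formulation

def transistors(years_elapsed, baseline=1.0):
    """Relative transistor count after a given number of years."""
    return baseline * 2 ** (years_elapsed / DOUBLING_PERIOD_YEARS)

for years in (3, 15, 30):
    print(f"After {years:2d} years: {transistors(years):,.0f}x the baseline")
# After 3 years: 4x; after 15 years: 1,024x; after 30 years: ~1,048,576x.
```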
The law, in whatever form, is important because it determines what we can expect to do over the next decades. For example, stealth never produces a signature-less airplane, missile, or submarine. Instead, it reduces signatures by some large factor, leaving a much smaller signature to be distinguished from the surrounding noise. Signal processing claws back a recognizable signature; how well it works depends on how well the surrounding random noise can be separated from the man-made signature of the target. How we evaluate the future of stealthy aircraft therefore depends in large part on how we evaluate the future of radar-signal processing. The worse that future, the brighter the future of stealth, and conversely.
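As a toy illustration of how processing claws a weak signature back out of noise (a generic averaging demonstration, not any particular radar's algorithm), the sketch below averages many noisy looks at the same faint echo; the noise shrinks with the square root of the number of looks while the echo does not:

```python
# Toy demonstration, not a real radar algorithm: averaging repeated noisy
# looks at the same weak echo makes it stand out from random noise.
import random

random.seed(1)
SIGNAL = 0.1     # weak deterministic echo amplitude (assumed)
NOISE_STD = 1.0  # surrounding noise is ten times stronger (assumed)

def averaged_look(n_pulses):
    """Average n_pulses independent noisy measurements of the echo."""
    return sum(SIGNAL + random.gauss(0, NOISE_STD) for _ in range(n_pulses)) / n_pulses

for n in (1, 100, 10000):
    print(f"{n:5d} pulses -> estimate {averaged_look(n):+.3f} (true value {SIGNAL})")
# The noise in the average shrinks like 1/sqrt(n), so the 0.1 echo
# emerges from the 1.0 noise only after many pulses are combined.
```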
The future of Moore's Law is typically tied to chip production. Manufacturers have managed to create smaller and smaller features (individual devices) on the faces of chips, so they have been able to cram in more and more of them; to do so, the cost of fabrication plants has risen exponentially. Chips are now becoming three-dimensional structures, which naturally require even more elaborate equipment to produce.
Smaller, Hotter, Faster
The new insight is that the limit may be imposed by heat. Every computation a chip carries out produces some heat. All the numbers in a chip are represented by small voltages (the voltage level indicates whether a bit is a zero or a one), and every time that voltage is flipped, or a number is sent in electrical form from one register to another, energy is expended. One of the unpleasant facts of thermodynamics is that some of that energy is inevitably given off as heat. Anyone who uses a digital device, from a cell phone to a computer, knows as much; cell phones get hot, and so do laptops.
What may be less obvious is that heat has consequences for the computer as well as for your hand or your lap. The physical world is essentially random, and temperature measures the level of that randomness. The hotter the chip, the greater the probability that enough thermal energy is present to turn a one into a zero, or vice versa, without that being intended. That is quite apart from the probability that a random cosmic ray will also flip some bits.
To make chips smaller and faster, manufacturers have reduced the energy involved: the energy difference between ones and zeroes has been cut. That has to be done if a more complex chip, one that does more computations per second, is to live off much the same power supply. Unfortunately, the scale of randomness in a chip depends not on the chip's complexity but on its temperature and on the cosmic rays (including particles from the sun) that hit it (which is why electromagnetic pulse is such a devastating problem). The smaller the difference between ones and zeroes, the worse the effect of that random energy.
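A standard way to reason about this trade, as a textbook approximation rather than a statement about any particular chip, is the Boltzmann factor: the chance that thermal agitation supplies the energy E needed to flip a bit scales roughly as exp(-E/kT), where T is the chip's temperature:

```python
# Textbook approximation: the probability that thermal noise supplies
# enough energy E to flip a bit scales roughly as exp(-E / kT).
import math

K_BOLTZMANN = 1.380649e-23  # J/K

def flip_factor(energy_joules, temp_kelvin):
    return math.exp(-energy_joules / (K_BOLTZMANN * temp_kelvin))

kT_room = K_BOLTZMANN * 300  # about 4.1e-21 J at room temperature
for multiple in (100, 60, 40):  # bit energy margin as a multiple of kT (assumed values)
    e = multiple * kT_room
    print(f"E = {multiple:3d} kT: flip factor ~ {flip_factor(e, 300):.1e}")
# Cutting the energy margin from 100 kT to 40 kT raises the flip factor by
# roughly 26 orders of magnitude; heating the chip shrinks the margin further.
```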
To make matters more interesting, shielding chips from cosmic rays (and the effects of sunspots, for example) probably makes it much more difficult to cool them. Making a chip three-dimensional would also make it much more difficult to cool its inner elements. It is easy to imagine solving the heat problem by submerging a chip in liquid nitrogen, but it is not so obvious how the heat flowing from a hot interior to a cool exterior would affect chip performance.
We already have some idea of how heat affects chips, because the military uses specially designed high-temperature chips for certain demanding applications. Typically the way to cope with high temperature is to make the difference between ones and zeroes larger, so that bits can be distinguished despite the thermal agitation of the chip's material by the high outside temperature. The extra heat generated by the greater power fed into the chip is negligible compared with the outside heat in which the chip is working. However, such a chip probably operates considerably more slowly, simply because every computation involves a lot more power.
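The first-order rule behind that trade is the usual expression for the dynamic power of CMOS logic, P ≈ αCV²f: power grows with the square of the voltage swing. The numbers in the sketch below are hypothetical, chosen only to show the scaling:

```python
# First-order CMOS dynamic-power rule: P ~ alpha * C * V^2 * f.
# All numbers below are hypothetical, chosen only to show the scaling.
def dynamic_power(v_supply, freq_hz, switched_cap=1e-9, activity=0.2):
    """Approximate switching power in watts."""
    return activity * switched_cap * v_supply**2 * freq_hz

base = dynamic_power(v_supply=1.0, freq_hz=2e9)
hot  = dynamic_power(v_supply=2.0, freq_hz=2e9)  # doubled swing for noise margin
print(f"Same clock, doubled voltage: {hot / base:.0f}x the power")
# To hold power constant instead, the clock must drop by that same 4x factor,
# which is why a high-temperature chip tends to run considerably slower.
print(f"Same power budget: clock falls to {base / hot * 2e9 / 1e9:.1f} GHz")
```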
Chip designers already know that heat is a problem, and they have some solutions. They can arrange software so that portions of the chip not being used by the current calculation are powered down; that is one virtue of a multi-core chip. In effect, however, these solutions reduce computing power. The rated performance of the chip may be much higher, but if a large part of it is idle at any one time, its effective power is not nearly so impressive.
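Stated in one line, effective throughput is rated throughput times the fraction of the chip that can be powered at once; a trivial sketch with hypothetical numbers:

```python
# Hypothetical numbers: a chip whose heat budget allows only
# part of its cores to run at any one time.
RATED_CORES = 16
ACTIVE_FRACTION = 0.5   # half the cores stay dark to hold heat down (assumed)
PER_CORE_GFLOPS = 50    # assumed per-core throughput

rated = RATED_CORES * PER_CORE_GFLOPS
effective = rated * ACTIVE_FRACTION
print(f"Rated: {rated} GFLOPS, effective: {effective:.0f} GFLOPS")
```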
The Cost Factor
The more sophisticated design and configuration of such chips make them a lot more expensive to produce (remember the exponential rise in the cost of chip-making plants). At one time that would not have been a great problem: the military and other high-end users drove the computing industry, and they were willing to pay what was needed. Now, however, the civilian world drives computing; that is why so much of what the military buys is commercial off-the-shelf. It is not clear which part of the civilian world drives the industry, however. Is it consumer goods such as smart phones and laptops? Or is it mainframes and servers? The chip market determines the point at which it is no longer possible to justify that exponential rise in manufacturing investment, and reaching that point would also end Moore's Law.
The answer is important. It seems unlikely that smart phones and laptops can handle much hotter chips, simply as a matter of what people will tolerate. Indeed, the rise of smart phones probably reduces the acceptable heat level, because they run, often on more demanding applications, for much longer than ordinary cell phones, and neither offers much cooling. A big mainframe, by contrast, can employ some form of liquid cooling.
It may be that the appropriate formulation of Moore's Law is simply that the unit cost of computing is falling. That will not necessarily lead to more powerful individual chips, because the demand for such devices may not exist (already, consumers often see only a small performance improvement when they invest in chips nominally many times as powerful as their predecessors). If the most demanding consumer chip application is entertainment, then beyond some point more computing power may not translate into visible improvement, at which point it will no longer make much commercial sense. For example, we have known for roughly a century that the human eye sees about 24 to 30 frames per second as smooth motion (the rates long used in movies and television). A chip that presented a hundred frames a second would offer a useless degree of improvement.
It may be that the future of high-end computing is to gang together numerous chips. That may account for the current interest in cloud computing, in which some external server handles very demanding computing (or storage) applications. If real power requires that many computers or chips work in tandem, then the space and power devoted to them become significant issues. Incidentally, high-end demand may cover more than what might be obvious. Perhaps the first major application of cloud computing was high-capacity e-mail, in which the cloud both provides new messages and stores an increasing volume of existing ones. In that case the demand is for safe storage, which means the cloud provider must maintain multiple duplicate storage computers. With any form of cloud computing, one key issue is of course the connection between user and cloud: how capacious is it, how secure, and how quick? The noisier the connection, the more computing power must go into simply ensuring that data pass correctly.
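The classic way to quantify that last point is Shannon's capacity formula, C = B log2(1 + S/N): as the signal-to-noise ratio of the link falls, so does the data rate it can carry, and approaching even that reduced rate takes heavier error-correction coding. The bandwidth and SNR figures below are illustrative assumptions:

```python
# Shannon capacity of a noisy link: C = B * log2(1 + S/N).
# The bandwidth and SNR values are illustrative, not measured.
import math

def capacity_bps(bandwidth_hz, snr_linear):
    return bandwidth_hz * math.log2(1 + snr_linear)

B = 10e6  # an assumed 10 MHz channel
for snr_db in (30, 10, 0):
    snr = 10 ** (snr_db / 10)  # convert decibels to a linear power ratio
    print(f"SNR {snr_db:2d} dB -> capacity {capacity_bps(B, snr) / 1e6:6.1f} Mbit/s")
# 30 dB: ~99.7 Mbit/s; 10 dB: ~34.6 Mbit/s; 0 dB: 10 Mbit/s. Noisier links
# carry less, and approaching these limits takes heavier coding, i.e., computation.
```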
If this analysis is correct, then for our purposes a combination of real estate and power will often be decisive. A ship or submarine offers both, so as Moore's Law tapers off it can fight back by multiplying the chips in its servers; the real limits on its computing will probably be factors such as fiber-optic bandwidth and the ease with which storage can be accessed. Airplanes are smaller and have less power, and less ability to siphon off heat, but they can still accommodate multiple processors. Missiles are further down the scale; for a missile or a small UAV, the equivalent of cloud computing is some degree of command guidance or a guidance technique such as track-via-missile. At the bottom of the scale would be the individual computer the U.S. Army often espouses, where the key issue is likely to be how to limit the desired applications so that it remains viable.