But some physicists and engineers think we might be bumping up against some fundamental physical limits when it comes to transistor size.
Industry researchers said that shrinking the transistor size was a technical tour de force.
You'll usually see the terms "feature size" and "transistor size" used interchangeably, because the most important feature on an integrated circuit is the transistor.
There are significant differences in doping levels, ions used, and transistor size.
This is because Moore's paper dealt with more than just shrinking transistor sizes.
As transistor sizes have shrunk, the amount of wasted current (and therefore heat) has declined, but heat is still being created.
Once we reach the lower limits of transistor size, I'm sure computers will continue to improve.
Thus even relatively low clock-speed devices with very small transistor sizes are still subject to increases in power density if leakage current is not controlled.
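The link between leakage current and power density can be sketched with a simple model: total power approximated as dynamic switching power (C·Vdd²·f) plus static leakage power (Vdd·I_leak), divided by die area. All numbers and the function name below are illustrative assumptions, not measurements from any real device.

```python
# Illustrative sketch: power density under a simple dynamic + leakage model.
# Every value here is a made-up assumption for demonstration only.

def power_density(cap_f, vdd, freq_hz, leak_a, area_cm2):
    """Approximate power density in W/cm^2.

    P_dynamic = C * Vdd^2 * f   (switched-capacitance model)
    P_leakage = Vdd * I_leak    (static leakage, independent of clock)
    """
    p_dyn = cap_f * vdd ** 2 * freq_hz
    p_leak = vdd * leak_a
    return (p_dyn + p_leak) / area_cm2

# Hypothetical chip: 1 nF switched capacitance, 1.0 V supply,
# 1 GHz clock, 1 cm^2 die -- same clock speed in both cases.
low_leak = power_density(1e-9, 1.0, 1e9, leak_a=0.1, area_cm2=1.0)
high_leak = power_density(1e-9, 1.0, 1e9, leak_a=5.0, area_cm2=1.0)

# Even without raising the clock, uncontrolled leakage raises power density.
print(f"low leakage:  {low_leak:.1f} W/cm^2")   # 1.1 W/cm^2
print(f"high leakage: {high_leak:.1f} W/cm^2")  # 6.0 W/cm^2
```

Note that the leakage term does not scale with frequency, which is why a low clock speed alone does not protect a dense chip from thermal problems.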
The law is based on the steady shrinkage of transistor size.
These can cause data corruption or system shutdown, and are becoming an increasing problem as transistor sizes are reduced.