Moore's law states that processing speed and computing capability will double roughly every two years.

Moore's Law is the observation by Intel co-founder Gordon Moore that the number of transistors that can fit on a microchip doubles every 18 to 24 months. This has held true since the 1960s. In general, the capacity of random access memory (RAM) chips has also improved according to Moore's law. Postulated from empirical data, Moore's law indicates that the storage capacity of memory chips has increased by a factor of four every three years.
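
As a rough illustration of this growth rate, here is a short Python sketch (not from the article); the 2,300-transistor starting point is the figure commonly quoted for Intel's 1971 microprocessor and is an assumption of the example, as is the fixed doubling period.

```python
# Illustrative only: project transistor counts under a fixed doubling period.
# The 2,300-transistor, 1971 starting point is an assumption of this example;
# real products deviate from the smooth trend.

def projected_transistors(start_count, start_year, year, doubling_period_years):
    """Return the transistor count predicted by a fixed doubling period."""
    doublings = (year - start_year) / doubling_period_years
    return start_count * 2 ** doublings

for year in (1981, 1991, 2001):
    low = projected_transistors(2_300, 1971, year, 2.0)    # 24-month doubling
    high = projected_transistors(2_300, 1971, year, 1.5)   # 18-month doubling
    print(f"{year}: {low:,.0f} to {high:,.0f} transistors")
```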

Keeping up with this famous prediction is getting harder as chip makers approach the limits of semiconductor materials. Computer microchips made with semiconductors are everywhere: in cars, telephones, ovens, and even singing birthday cards. The 300 mm wafer from International SEMATECH is the beginning of the next generation of semiconductor productivity. New semiconductor facilities being built today will process chips on these larger wafers.

Shockley, Bardeen, and Brattain invented the transistor in 1947 and started the modern electronics age. Kilby and Noyce next combined active and passive components on a single chip and invented the integrated circuit. Fairchild's Isoplanar technology made medium- and large-scale integrated circuits possible in 1972.

For semiconductor devices, smaller has always been better. By the late 1970s it was also clear that such tiny devices required better measurements for quality control: smaller devices demand finer metrology to measure a host of semiconductor properties such as circuit linewidths, the thickness and alignment of circuit layers, and the trace chemical composition of specific device regions.

Semiconductor devices are employed in most electronic equipment, including information processing apparatuses and home appliances. Current demands on information processing apparatus, such as computers, require that the apparatus possess a large processing capacity and a high processing speed. Thus, the semiconductor devices in the information processing apparatus must also have a high response speed and a large storage capacity. This is achieved through higher integration of the semiconductor devices.

This increase has been accomplished through a combination of reducing the size of the semiconductor devices installed on the chip and increasing the number of devices integrated on the chip accordingly. The smaller the semiconductor device installed on the silicon chip becomes, the finer the interconnect lines of the device must become. However, the signals running through the interconnect lines may interfere with each other when the lines are arranged close to one another; in fact, such interference causes delays in the device when the spacing of the interconnect lines falls below a certain value. In this situation, the specific resistance of the metal used to form the interconnect lines must be reduced if a high processing speed is to be maintained.
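
To make the resistivity point concrete, here is a hedged Python sketch comparing two common interconnect metals with the simple model R = ρL/A; the resistivity values are approximate bulk figures and the line dimensions are assumptions of the example, not data from the article.

```python
# Illustrative sketch: wire resistance R = rho * L / A for two interconnect
# metals. Dimensions and resistivities are assumed, illustrative values.

RESISTIVITY_OHM_M = {      # approximate bulk resistivities at room temperature
    "aluminum": 2.7e-8,
    "copper": 1.7e-8,
}

def wire_resistance(rho, length_m, width_m, thickness_m):
    """Resistance of a rectangular line: R = rho * L / (width * thickness)."""
    return rho * length_m / (width_m * thickness_m)

length = 1e-3        # 1 mm line (assumed)
width = 100e-9       # 100 nm wide (assumed)
thickness = 200e-9   # 200 nm thick (assumed)

for metal, rho in RESISTIVITY_OHM_M.items():
    r = wire_resistance(rho, length, width, thickness)
    print(f"{metal}: {r:,.0f} ohms")
```

The lower-resistivity metal gives a proportionally lower line resistance for the same geometry, which is why reducing the specific resistance of the interconnect metal helps preserve speed as lines shrink.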

Common computers provide digital processing in which data are held in positive or negative states (or off and on states) of a device. Digital devices can be semiconducting, magnetic, optical, piezoelectric, or other devices. This is referred to as digital computing, and it is the economic and technical heart of all current computers, semiconducting devices for computers, and computer software. Digital technology requires that all data be encoded in binary, as powers of 2; this in turn requires that data manipulation, speed, storage, and so on expand at this enormous rate.
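
A minimal Python sketch of this point: each additional bit doubles the number of distinct values a digital word can represent, so capacity grows as a power of 2. The byte value used at the end is only an example.

```python
# Each added bit doubles the number of representable states: n bits -> 2**n values.
for n_bits in (1, 8, 16, 32, 64):
    print(f"{n_bits:2d} bits -> {2 ** n_bits:,} distinct values")

# Example: the bit pattern 01000001 (decimal 65) is the same physical set of
# on/off states whether it is read as the number 65 or the letter 'A'.
value = int("01000001", 2)
print(value, chr(value))
```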

This digital route requires significant increases in semiconductor chip size, speed, and complexity to accommodate even modest improvements in performance. Semiconductor engineers have responded by making devices smaller, with smaller spacing, in larger numbers, and with ever-increasing complexity. The requirements for smaller spacing have pushed the limits of materials and photolithography, and it is estimated that designers are reaching the limits of Moore's Law (which states that devices will continue to decrease in size and double in capacity every 18 months); in addition, the heat produced by decreasing device spacing is imperiling device performance.

Because of these limitations, the digital computer is rapidly becoming too large and too complex for large-scale number manipulation such as weather analysis, high-level encryption, drug discovery, genetic manipulation, and many other applications as yet undiscovered.

An entirely new type of computer has been proposed which is based on quantum behavior. The spin state of an atom or group of atoms can be manipulated using a number of methods and the spin state can be detected, and/or controllably altered, using an energy source or detector such as an optical source or detector. An atom or atoms with discrete spin states are analogous to a bit in a traditional computer. However, due to the quantum nature of the spin states, a quantum bit (or qubit) can exist in not just one of two states, but also in a superposition of these states. It is this superposition of states which makes it possible for qubit based computers to analyze information at a much greater speed than is possible for traditional computers.
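
To make the idea of superposition concrete, here is a small NumPy sketch (an illustration, not a description of any proposed device): a single qubit is represented as a normalized two-component state vector, and an equal superposition gives a 50/50 chance of measuring either basis state.

```python
import numpy as np

# A qubit is a normalized 2-component complex vector over the basis |0>, |1>.
ket0 = np.array([1.0, 0.0], dtype=complex)   # analogous to a classical "0"
ket1 = np.array([0.0, 1.0], dtype=complex)   # analogous to a classical "1"
plus = (ket0 + ket1) / np.sqrt(2)            # equal superposition of both

def measurement_probabilities(state):
    """Born rule: the probability of each basis outcome is |amplitude|**2."""
    return np.abs(state) ** 2

print(measurement_probabilities(ket0))   # [1. 0.]   -> always measures 0
print(measurement_probabilities(plus))   # [0.5 0.5] -> 0 or 1 with equal chance
```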

The name qubit is quite generic: some devices use (and need to use) only one qubit, whereas others may use many. Proposed devices include single-qubit optical amplifiers for encrypted and very-high-speed messages, multiple-qubit devices for information storage, and multiple-qubit devices for high-speed, high-density computing. Because the atom can exist in a large number of spin states simultaneously, the interaction of spin states enables a high number of computations with only a small number of atoms. The entire qubit chip of a supercomputer might well be smaller than a fingernail. In addition, qubit technology holds promise for combining with optical waveguide technology to build high-speed optical buses for conventional computers while increasing encryption capabilities.
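
As a rough illustration of why a small number of qubits can carry so much state, the sketch below contrasts the n binary values held by n classical bits with the 2**n complex amplitudes needed to describe a general n-qubit superposition; the framing is an assumption of this example rather than a claim from the article.

```python
# n classical bits hold n binary values; a general state of n qubits requires
# 2**n complex amplitudes because of superposition.
for n in (1, 2, 10, 50):
    print(f"{n:2d} classical bits: {n} values | "
          f"{n:2d} qubits: 2**{n} = {2 ** n:,} amplitudes")
```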

What does Moore's law state?

Moore's law is a term used to refer to the observation made by Gordon Moore in 1965 that the number of transistors in a dense integrated circuit (IC) doubles about every two years.

How long will Moore's law last?

As we continue to miniaturize chips, we'll no doubt bump into Heisenberg's uncertainty principle, which limits precision at the quantum level, thus limiting our computational capabilities. James R. Powell calculated that, due to the uncertainty principle alone, Moore's Law will be obsolete by 2036.

Does Moore's law have a limit?

There is a limit to Moore's Law, however. As transistors approach the size of a single atom, their functionality begins to get compromised due to the particular behavior of electrons at that scale. In a 2005 interview, Moore himself stated that his law “can't continue forever.”

What is Moore's law, and how does it affect the capabilities of computers?

Physicist Michio Kaku is talking about the so-called law that says the number of transistors that can fit on a computer chip will double every 18 months, resulting in periodic increases in computing power. According to Kaku: "In about ten years or so, we will see the collapse of Moore's law."