
15.4.4 Molecular-level computing.

In the previous section we looked at molecular-level computing derived from biological systems. While biocomputing offers one path towards molecular-level computing, in this section we explore another approach: directly engineering computing systems at the atomic level, from the ground up, without reference to biology.

The concept of atomic-level computing traces its roots back to a talk given by Richard Feynman in 1959 to the American Physical Society at Caltech entitled “There’s Plenty of Room at the Bottom”[14]. In his talk, Feynman anticipated the field of microelectronics and outlined some of the approaches, such as photolithography, that were later adopted for the production of integrated circuits. This was an amazing insight, given that computers filled entire rooms in his day and most people were talking about making them larger, not smaller. But Feynman didn’t stop there; he went on to describe some of the advantages of, and approaches for, manipulating matter at the atomic scale. In other words, Feynman’s talk anticipated the field of nanotechnology, which is concerned with the engineering of systems at the level of individual atoms.

Devices such as scanning tunneling microscopes (STMs) and atomic force microscopes (AFMs) can be used to image and even manipulate individual atoms. By 1990, this technology had developed to the point where researchers were able to spell out the letters “IBM” by positioning 35 xenon atoms on a nickel surface, and then image the result. We can easily imagine a memory system in which xenon atoms are laid out in a 2-D grid of rows and columns on the nickel surface. A “1” would be represented by the presence of a xenon atom at a particular row and column position, and a “0” by the absence of an atom at that position. Both reading and writing of data could be accomplished with the STM or AFM.

Though the information density of such a system would be almost unimaginably high, it would be impractical because the data read and write times would be extremely slow. Despite its impracticality, this example does make the point that atomic-scale memory densities are possible.
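To make the row-and-column storage scheme concrete, the short Python sketch below models the surface as a grid of sites: writing a bit places or removes an atom at a site, and reading a bit checks whether the site is occupied. The class name, grid dimensions, and read/write interface are illustrative assumptions for this toy model, not a description of any real STM- or AFM-based memory.

# A toy model (not a physical simulation) of the xenon-atom memory idea:
# a bit is "1" if an atom occupies a (row, column) site and "0" otherwise.
class AtomicGridMemory:
    def __init__(self, rows: int, cols: int):
        self.rows, self.cols = rows, cols
        self.occupied = set()   # set of (row, col) sites currently holding an atom

    def write(self, row: int, col: int, bit: int) -> None:
        """Place an atom for a 1, remove it for a 0 (the 'write' operation)."""
        if not (0 <= row < self.rows and 0 <= col < self.cols):
            raise IndexError("site outside the grid")
        if bit:
            self.occupied.add((row, col))
        else:
            self.occupied.discard((row, col))

    def read(self, row: int, col: int) -> int:
        """Report 1 if an atom is present at the site, 0 otherwise (the 'read' operation)."""
        return int((row, col) in self.occupied)

# Example: store the 8-bit pattern 01001101 along row 0 and read it back.
mem = AtomicGridMemory(rows=4, cols=8)
for col, bit in enumerate([0, 1, 0, 0, 1, 1, 0, 1]):
    mem.write(0, col, bit)
print("".join(str(mem.read(0, col)) for col in range(8)))   # prints 01001101

Of course, in the physical scheme each write corresponds to dragging an individual atom into or out of place with the microscope tip, which is exactly why the read and write times would be so slow.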

In addition to being able to store data, a computer system must also be able to manipulate that data under program control.

In the 1990s there was a lot of excitement surrounding the possibility of constructing molecular-level nanomechanical computer systems. These nanomechanical computers, first described in detail by K. Eric Drexler, would be mechanical computers built of rods and springs, but at the nanometer scale. Such nanotechnology promises far smaller and more energy-efficient computers than exist today.

While the idea of directly engineering systems from the bottom up, ‘atom by atom’, is intriguing, there has been little measurable progress towards the development of nanomechanical computers over the last few decades, and interest in this approach has waned. Some argue that Drexler’s designs are simply impractical, and that various laws of physics preclude the construction of reliable nanomechanical computers. On the other hand, the existence of naturally occurring biological (bio-molecular) systems that encode proteins and transcribe DNA proves that robust molecular-level computing systems can, and indeed already do, exist. The fact that we currently lack the ability to engineer nano-scale computers from the ground up does not mean that breakthroughs are not “just around the corner”.


Footnotes

[14]  A published copy of this talk – well worth a read over half a century after it was given – can be found on the web at http://www.zyvex.com/nanotech/feynman.html. I encourage you to check it out.
