
Nanotechnology, the process of manipulating matter on an atomic or molecular scale, has been a staple of science fiction for decades. Now it's beginning to break out into real science, and some technology critics are already starting to complain. If they're listened to, the most important technology of the 21st century may be strangled in its crib.

Nanotechnology was first envisioned by famed physicist Richard Feynman, who realized that what chemists do by mixing molecules together in large numbers, scientists could do on a smaller scale by plugging together individual atoms as desired:  "Put the atoms down where the chemist says, and so you make the substance."

Modern nanotechnology researchers want to go beyond synthesizing "substances" (though that has great importance) and use nanotechnology's atom-by-atom construction techniques to produce objects: Tiny, bacterium-sized devices that can repair clogged arteries, kill cancer cells, fix cellular damage from aging, and make other devices of greater size or complexity.

Other researchers believe that nanotechnology will allow for a degree of miniaturization that could make computers a millionfold more efficient than anything available now. And still others believe that nanotechnology's tiny devices will be able to unravel mysteries of the microscopic world (cell metabolism, the aging process, cancer) in ways that other tools cannot.

So far, research has produced some small devices, but nothing as exotic as those above. But nanotechnologists are refining both their instrumentation and their understanding of nanofabrication at an accelerating rate. Already, the United States government has identified nanotechnology as one of the key technologies for the 21st century, suggesting that its mastery will be essential to economic prosperity and military success. And quite a few companies are interested in developing nanotechnology commercially.

The states of Texas and Virginia, along with the federal government, have created initiatives to fund and encourage nanotechnology research and commercialization, in the expectation that it will produce jobs and economic growth.

You'd think that nanotechnology would generate few objections. Some environmentalists love it, seeing it as a clean replacement for dirty and polluting industries.

Environmental activist Terence McKenna once called nanotechnology "the most radical of the green visions." What's more, since it involves only machines, and not DNA, nanotechnology research is free from the kind of "tampering with life" objections sometimes raised where biotechnology is concerned.

But not everyone is happy. As nanotechnology starts to look more real, some environmentalists have started to complain, and even to call for a moratorium on nanotechnology research. Some, like the ETC Group, say that nanoparticles will be the "next asbestos," though there is little evidence that this is the case. Then there are antitechnology activists like Kirkpatrick Sale or Jeremy Rifkin who simply believe that human beings shouldn't have access to such powerful technologies, because they regard humanity as morally unfit for such power.

Others, like Sun Microsystems chief scientist Bill Joy, have expressed fears that nanotechnology, in conjunction with other technologies like artificial intelligence, may make humans obsolete. Joy calls for a "relinquishment" of these technologies, possibly including nanotechnology.

This seems like a bad idea to me. Experience with recombinant DNA research in the 1970s demonstrated that there was a lot of unnecessary worry about risks that turned out, on further research, not to be real. Had that later research (such as the 1977 Cold Spring Harbor experiments showing that bacteria couldn't fully "read" DNA from higher animals) not been done because of some sort of long-lasting moratorium, we would have been denied the many benefits that biotechnology has brought us: Improved insulin, safer vaccines, and treatments and therapies for many other diseases -- not to mention the diagnostic tools without which we never would have been able to identify HIV, the virus that causes AIDS, much less come up with new treatments.

Back when the wisdom of DNA research was being debated, scientists Freeman Dyson and Matthew Meselson argued that the biggest risk would come from not going ahead with the research, because of the benefits we would be forgoing. Meselson even said (before AIDS was discovered) that we would need those tools to deal with "forthcoming catastrophes." They turned out to be more right than they knew. The same is likely to be true where nanotechnology is concerned.

In the area of DNA research, scientists agreed on guidelines for safety, often called the "Asilomar guidelines," after a conference at which they were originally developed. Some now think those guidelines are too restrictive, but they did allow research to continue without the sort of problems that some critics feared. Nanotechnology researchers are at work on a similar set of guidelines that are designed to prevent accidents and minimize abuse while allowing humanity to reap the benefits of a technology that will allow tremendous economic, environmental and medical progress.

That won't satisfy the critics who simply oppose technology on the basis of ideology, but for the rest of us it should be quite satisfactory.

Glenn Harlan Reynolds is a law professor at the University of Tennessee and publishes InstaPundit.Com. He is co-author, with Peter W. Morgan, of The Appearance of Impropriety: How the Ethics Wars Have Undermined American Government, Business, and Society (The Free Press, 1997).
