NANOTECHNOLOGY: MASTERING THE MATERIAL WORLD
From chapter 10 (The Man in the White Suit), Films from the Future: The Technology and Morality of Science Fiction Movies
On December 29, 1959, the physicist Richard Feynman gave a talk at the annual meeting of the American Physical Society, which was held that year at the California Institute of Technology. In his opening comments, Feynman noted:
“I would like to describe a field, in which little has been done, but in which an enormous amount can be done in principle.
This field is not quite the same as the others in that it will not tell us much of fundamental physics (in the sense of, “What are the strange particles?”) but it is more like solid-state physics in the sense that it might tell us much of great interest about the strange phenomena that occur in complex situations. Furthermore, a point that is most important is that it would have an enormous number of technical applications. What I want to talk about is the problem of manipulating and controlling things on a small scale.”
Feynman was intrigued by what could be achieved if we could only manipulate matter at the scale of individual atoms and molecules. At the time, he was convinced that scientists and engineers had barely scratched the surface of what was possible, so much so that he offered a $1,000 prize to the first person to work out how to write a page of a book in type so minuscule that it was reduced to 1/25,000 of its original scale. (The prize was won twenty-six years after Feynman set the challenge by physicist Tom Newman, who wrote the first page of Charles Dickens’ A Tale of Two Cities on a 200-μm square piece of plastic, using electron-beam lithography. For more information, see Katherine Kornei (2016) “The Beginning of Nanotechnology at the 1959 APS Meeting,” APS News, November 2016.)
Feynman’s talk didn’t garner that much attention at first. But, over the following decades, it was increasingly seen as a milestone
in thinking about what could be achieved if we extended our engineering mastery to the nanometer scale of atoms and molecules. In 1986, Eric Drexler took this up in his book Engines of Creation and popularized the term “nanotechnology.” Yet it wasn’t until the 1990s, when the US government became involved, that the emerging field of nanotechnology hit the big-time.
What intrigued Feynman, Drexler, and the scientists who followed them was the potential of engineering with the finest building blocks available, the atoms and molecules that everything’s made of (the “base code” of physical materials, in the language of chapter nine). As well as the finesse achievable with atomic-scale engineering, scientists were becoming increasingly excited by some of the more unusual properties that matter exhibits at the nanoscale, including changes in conductivity and magnetism, and a whole range of unusual optical behaviors. What they saw was an exciting new set of ways they could play with the “code of atoms” to make new materials and products.
In the 1980s, this emerging vision was very much in line with Drexler’s ideas. But in the 1990s, there was an abrupt change in direction and expectations. And it occurred at about the time the US federal government made the decision to invest heavily in nanotechnology.
In the 1990s, biomedical science in the US was undergoing something of a renaissance, and federal funding was flowing freely into the US’s premier biomedical research agency, the National Institutes of Health. This influx of research funding was so prominent that scientists at the National Science Foundation—NIH’s sister agency—worried that their agency was in danger of being marginalized. What they needed was a big idea, one big enough to sell to Congress and the President as being worthy of a massive injection of research dollars.
Building on the thinking of Feynman, Drexler, and others, the NSF began to develop the concept of nanotechnology as something they could sell to policy makers. It was a smart move, and one that was made all the smarter by the decision to conceive of this as a cross-agency initiative. Smarter still was the idea to pitch nanotechnology as a truly interdisciplinary endeavor that wove together emerging advances in physics, chemistry, and biology, and that had something for everyone in it. What emerged was a technological platform that large numbers of researchers could align their work with in some way, that had a futuristic feel, and that was backed by scientific and business heavyweights. At the heart of this platform was the promise that, by shaping the world atom by atom, we could redefine our future and usher in “the next Industrial Revolution.” (The report “Nanotechnology: Shaping the World, Atom by Atom” was published by the National Science and Technology Council Committee on Technology, and the Interagency Working Group on Nanoscience, Engineering and Technology in 1999.)
This particular framing of nanotechnology caught on, buoyed up by claims that the future of US jobs and economic prosperity depended on investing in it. In 2000, President Clinton formed the US National Nanotechnology Initiative, a cross-agency initiative that continues to oversee billions of dollars of federal research and development investment in nanotechnology. (In the spirit of full disclosure, I was involved in the early days of the National Nanotechnology Initiative, and was the first co-chair of the interagency committee within the NNI to examine the environmental and health implications of nanotechnology.)
Eighteen years later, the NNI is still going strong. As an initiative, it has supported some incredible advances in nanoscale science and engineering, and it has led the growth of nanotechnology the world over. Yet, despite the NNI’s successes, it has not delivered on what Eric Drexler and a number of others originally had in mind. Early on, there was a sharp and bitter split between Drexler and those who became proponents of mainstream nanotechnology, as Drexler’s vision of atomically precise manufacturing was replaced by more mundane visions of nanoscale materials science.
With hindsight, this isn’t too surprising. Drexler’s ideas were bold and revolutionary, and definitely not broadly inclusive of existing research and development. In contrast, because mainstream nanotechnology became a convenient way to repackage existing trends in science and engineering, it was accessible to a wide range of researchers. Regardless of whether you were a materials scientist, a colloid chemist, an electron microscopist, a molecular biologist, or even a toxicologist, you could, with little effort, rebrand yourself as a nanotechnologist. Yet despite the excitement and the hype—and some rather Transcendence-like speculation—what has come to be known as nanotechnology actually has its roots in early-twentieth-century breakthroughs.
In 1911, the physicist Ernest Rutherford proposed a novel model of the atom. Drawing on groundbreaking experiments from a couple of years earlier, Rutherford’s model revolutionized our understanding of atoms, and underpinned a growing understanding not only of how atoms and molecules come together to make materials, but of how their specific arrangements affect the properties of those materials.
Building on Rutherford’s work, scientists began to develop increasingly sophisticated ways to map out the atomic composition and structure of materials. In 1912, it was discovered that the regular arrangement of atoms in crystalline materials could diffract X-rays
in ways that allowed their structure to be deduced. In 1931, the first electron microscope was constructed. By the 1950s, scientists like Rosalind Franklin were using X-rays to determine the atomic structure of biological molecules. This early work on the atomic and molecular makeup of materials laid the foundations for the discovery of DNA’s structure, the emergence of transistors and integrated circuits, and the growing field of materials science. It was a heady period of discovery, spurred on by the realization that atoms, and how they’re arranged, are the key to how materials behave.
By the time Feynman gave his lecture in 1959, scientists were well on the way to understanding how the precise arrangement of atoms in a material determines what properties it might exhibit. What they weren’t so good at was using this emerging knowledge to design and engineer new materials. They were beginning to understand how things worked at the nanoscale, but they still lacked the tools and the engineering dexterity to take advantage of this knowledge.
This is not to say that there weren’t advances being made in nanoscale engineering at the time—there were. The emergence
of increasingly sophisticated synthetic chemicals, for instance, depended critically on scientists being able to form new molecules by arranging the atoms they were made of in precise ways, and, in the early 1900s, scientists were creating a growing arsenal of new chemicals. At the same time, scientists and engineers were getting better at making smaller and smaller particles, and using some of the convenient properties that come with “smallness,” like adding strength to composite materials and preventing powders from caking. By the 1950s, companies were intentionally manufacturing a range of nanometer-scale powders out of materials like silicon dioxide and carbon.
As the decades moved on, materials scientists became increasingly adept at manufacturing nanoscopically small particles with precisely designed properties, especially in the area of catalysts. Catalysts speed up specific chemical reactions, and make them more likely to occur, by reducing the energy needed to initiate them. From the early 1900s, using fine particles as catalysts—so-called heterogeneous catalysts—became increasingly important in industry, as they slashed the costs and energy overheads of chemical processing. Because catalytic reactions occur at the surface of these particles, the smaller the particles, the more overall surface area there is for reactions to take place on, and the more effective the catalyst is.
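The inverse relationship between particle size and available surface area can be seen with a quick back-of-the-envelope calculation. The sketch below (the helper function and the silica density figure are illustrative assumptions, not from the text) shows that, for a fixed mass of material divided into spherical particles, shrinking the particle radius tenfold exposes ten times as much catalytic surface:

```python
# For a fixed total mass of monodisperse spheres, total surface area
# scales as 1/r: surface area per unit mass A/m = 3 / (rho * r).
# Smaller particles therefore expose more surface for catalysis.

def specific_surface_area(radius_m, density_kg_m3):
    """Surface area per kilogram for spheres of a given radius."""
    return 3.0 / (density_kg_m3 * radius_m)

rho = 2200.0  # approximate density of amorphous silica, kg/m^3

for r_nm in (1000, 100, 10):
    r = r_nm * 1e-9  # convert nanometers to meters
    area = specific_surface_area(r, rho)
    print(f"radius {r_nm:>5} nm -> {area:,.0f} m^2 per kg")
```

At a 10-nanometer radius, a single kilogram of silica presents on the order of a hundred thousand square meters of surface, which is why grinding a catalyst down to the nanoscale pays off so dramatically.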
This led to increasing interest in creating nanometer-sized catalytic particles. But there was another advantage to using microscopically small particles in this way. When particles get so small that they are made of only a few hundred to a few thousand atoms, the precise arrangement of the atoms in them can lead to unexpected behaviors. For instance, some particles that aren’t catalytic at larger sizes become catalytic at the nanoscale. Other particles interact with light differently; gold particles, for instance, appear red below a certain size. Others still can flip from being extremely inert to being highly reactive.
As scientists began to understand how particle size changes material behavior, they began developing increasingly sophisticated particle-based catalysts that were designed to speed up reactions and help produce specific industrial chemicals. But they also began to understand how the precise atomic configuration of everything around us affects the properties of materials, and can in principle be used to design how a material behaves.
This realization led to the field of materials science growing rapidly in the 1970s, and to the emergence of novel electronic components, integrated circuits, computer chips, hard drives, and pretty much every piece of digital gadgetry we now rely on. It also paved the way for the specific formulation of nanotechnology adopted by the US government and by governments and scientists around the world.
In this way, the NNI successfully rebranded a trend in science, engineering, and technology that stretched back nearly one hundred years. And because so many people were already invested in research and development involving atoms and molecules, they simply had to attach the term “nanotechnology” to their work, and watch the dollars flow. This tactic was so successful that, some years ago, a colleague of mine cynically defined nanotechnology as “a fourteen-letter fast track to funding.”
Despite the cynicism, “brand nanotechnology” has been phenomenally successful in encouraging interdisciplinary research and development, generating new knowledge, and inspiring a new generation of scientists and engineers. It’s also opened the way to combining atomic-scale design and engineering with breakthroughs in biological and cyber sciences, and in doing so it has stimulated technological advances at the convergence of these areas. But “brand nanotechnology” is most definitely not what was envisioned by Eric Drexler in the 1980s.
The divergence between Drexler’s vision of nanotechnology and today’s mainstream ideas goes back to the 1990s and a widely publicized clash of opinions between Drexler and the chemist Richard Smalley. Early in the evolution of the NNI, Drexler went head to head with Smalley, a Nobel Laureate, over the future of nanotechnology. A December 2003 cover story in the magazine Chemical & Engineering News provided a point-counterpoint platform for Drexler and Smalley to duke it out. (Drexler talks about the subsequent marginalization of his ideas in his 2013 book, Radical Abundance: How a Revolution in Nanotechnology Will Change Civilization.)
Where Drexler was a visionary, Smalley was a pragmatist. More than this, as the co-discoverer of the carbon-60 molecule (for which he was awarded the Nobel Prize in 1996, along with Robert Curl and Harry Kroto) and a developer of carbon nanotubes (a highly novel nanoscale form of carbon), he held considerable sway within established scientific circles. As the US government’s concept of nanotechnology began to take form, it was Smalley’s version that won out and Drexler’s version that ended up being sidelined.
Because of this, the nanoscale science and engineering of today looks far more like the technology in The Man in the White Suit than the nanobots in Transcendence. Yet, despite the hype behind “brand nano,” nanoscale science and engineering is continuing to open up tremendous opportunities, and not just in the area of stain-resistant fabrics. By precisely designing and engineering complex, multifunctional particles, scientists are developing new ways to design and deliver powerful new cancer treatments. Nanoscale engineering is leading to batteries that hold more energy per gram of material, and release it faster, than any previous battery technology. Nanomaterials are leading to better solar cells, faster electronics, and more powerful computers. Scientists are even programming DNA to create new nanomaterials.
Hype aside, we are learning to master the material world, and become adept in coding in the language of atoms and molecules. But just as with Stratton’s wonder material, with many of these amazing breakthroughs that are arising from nanoscale science and engineering, there are also unintended consequences that need to be grappled with.