The Hidden World
As technology continues to improve, the detail with which we can scrutinize and manipulate the world around us continues to sharpen. But with these unprecedented perspectives come fresh mysteries. Unfettered access to the molecular scale has created more new questions than answers to old quandaries. Never before have there been so many things to keep track of, catalog, investigate, and learn.
The nano world has proven tricky to navigate. Basic principles of chemistry and physics that scientists take for granted may not hold true at the nanoscale or molecular level. Saroj Nayak, associate professor of physics, applied physics, and astronomy, says it is common for materials to behave in unique and unexpected ways at these small sizes.
This is where Rensselaer’s new supercomputer comes into play.
Observing groups of tens or hundreds of millions of atoms creates unfathomable amounts of data. Building computer models of molecular or nanoscale environments, running them through various simulations, and then watching how the system works as a whole, down to the behavior of each group of molecules and each individual atom, creates even larger volumes of data.
Compiling and analyzing the data that results from a robust molecular or nanoscale experiment by hand could take millions of human lifetimes. A personal computer or laptop could do it more efficiently, but the calculations would likely still take tens or hundreds of thousands of years.
A typical supercomputer may be able to condense the calculation time down to days. CCNI, however, should be able to tackle these same feats in hours or in some cases even minutes. It’s helping scientific breakthroughs happen sooner rather than later. Even with less complex problems, CCNI’s speed will allow researchers to keep track of and model additional variables, allowing for more robust results. The more data incorporated into a simulation, the clearer the end picture is going to be.
From Small Things...
As the name implies, CCNI will advance research utilizing the leading-edge capabilities of nanotechnology. “It’s not just a tool, it’s a whole new infrastructure,” says Nayak. “Things are going to start moving fast.”
The study of nanotechnology, the ability to manipulate matter at the atomic and molecular level, has spurred relentless research over the past two decades, and the results have been fruitful. Carbon nanotubes hold the promise of serving as a foundation for everything from artificial muscles and super-efficient batteries to bulletproof vests that are exponentially stronger than the Kevlar vests used today by the military and law enforcement agencies.
The science of tiny stuff has also allowed researchers to perpetuate Moore’s Law. Named after Gordon Moore, one of the founders of computer giant Intel Corp., the law holds that the number of transistors, which are basically little on/off switches, that fit onto an integrated circuit, or computer chip, doubles every 18 to 24 months. This prediction, made in 1965, has functioned as both a challenge and an inspiration to researchers in private industry as well as academia.
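The compounding power of that doubling schedule is easy to underestimate. As a rough illustration, the short sketch below projects how a transistor count grows under Moore's Law; the function and its starting figure (the roughly 2,300 transistors of Intel's first microprocessor, the 4004) are illustrative assumptions, not part of the research described here.

```python
def transistors_after(years, start=2300, doubling_months=24):
    """Project a transistor count after `years`, assuming it doubles
    every `doubling_months` months (the slower end of Moore's Law)."""
    doublings = (years * 12) / doubling_months
    return start * 2 ** doublings

# Thirty years of 24-month doublings is 15 doublings:
# 2,300 transistors grow to about 75 million.
print(round(transistors_after(30)))
```

Even at the slower 24-month pace, fifteen doublings multiply the count by more than 32,000, which is why sustaining the law demands ever-smaller transistors.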
Chips used in most computers today are at the 65-nanometer node, meaning the length of the transistors is 65 nanometers, or 65 one-billionths of a meter. Researchers are looking into cost-efficient ways of manufacturing chips at the 45- and 32-nanometer nodes, employing the current complementary metal-oxide-semiconductor (CMOS) architecture.
It’s a task easier said than done, says Nayak. Working with such small components and materials requires extremely delicate and expensive equipment, and industry leaders are increasingly reluctant to invest serious cash into machines to build a working prototype of unproven technology. When you’re working on the molecular scale, Nayak says, there are plenty of unknowns, and it’s not uncommon for new, unseen phenomena, even a single atom out of place, to act as an “X factor” and spoil the entire effort.
While working to shrink the size of semiconductors, researchers are investigating the effectiveness of different materials as components. Cost, however, prevents many ideas from ever making it to the laboratory. Building computer models of these new chips and chip components, and running them through countless tests that simulate real-world conditions, has proven to be far more economical. It saves time and money; the automotive, aviation, and other industries already do extensive modeling before ever building a physical prototype of a new product.
The goal, says Nayak, is to identify test situations that result in consistently predictable results. Once everything behaves as it should, and reacts to different situations in a predictable manner from the nanoscale to the macroscale, companies can begin constructing a physical version of the virtual chip they have been testing.
Of course, keeping track of the thousands of atoms in a nanoscale transistor produces a mass of data. Modeling the entire chip, composed of billions of atoms, at the macroscale generates that much more. Collecting and analyzing this sheer bulk of data in a manageable time frame requires a supercomputer. Ironically, supercomputers used in this way are helping design improved machines that in five to 10 years will make even today’s top powerhouses look obsolete.
Nayak has worked on several supercomputers over the years, each one speeding up his research. With the CCNI, he expects the pace of his studies to accelerate even more. Currently, Nayak collaborates with IBM and several other Rensselaer researchers on projects in various fields including nanoelectronics, biology, and biocomputation. He expects the presence of CCNI to propel these projects forward at breakneck speed and greatly increase these kinds of beneficial collaborative opportunities.
But supercomputers and their benefits to research are still a work in progress. Mark Shephard, the Samuel A. and Elisabeth C. Johnson Professor of Engineering, researches new and better ways to harness the power of supercomputers like CCNI, along with collaborating on different modeling projects with Rensselaer colleagues. Shephard, who is also director of Rensselaer’s Scientific Computation Research Center, studies automated and adaptive analysis methods, as well as parallel adaptive simulation technologies. Partnering with IBM and the U.S. Department of Energy, along with other corporations and government entities, he’s looking to coax more efficiency out of computing algorithms run on these supercomputing platforms.
The challenge, Shephard says, is to fashion computer simulations in a way that can take advantage of that scalable processing. For example, a number of modeling projects require many thousands of processors running in parallel to execute in a short enough time for the results to be effectively used. The challenge is keeping each processor busy for the same amount of time. If various processes finish at different times, processors sit idle, reducing the overall computational power of the system. “With our Blue Gene system of 32,000 processors, even at 95 percent efficiency, there could be on the order of 1,600 processors not being well used. This is clearly an unacceptable situation,” says Shephard.
“Thus, you try to get it as close to perfect as possible and use methods that do not degrade with the dramatic increase in numbers of processors we now need to use,” Shephard says. “Development of these techniques impacts well past what we personally do. They impact scientific discovery and a company’s ability to be competitive and meet customer demand.”
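Shephard's figure follows from simple arithmetic: lost parallel efficiency is equivalent to some number of processors doing nothing at all. The hypothetical helper below just restates that calculation; it is an illustration, not part of CCNI's software.

```python
def idle_equivalent(num_processors, efficiency):
    """Return the processor-equivalents of capacity lost when a parallel
    job runs at the given efficiency (1.0 means perfectly balanced)."""
    return num_processors * (1.0 - efficiency)

# 32,000 processors at 95 percent efficiency waste the equivalent
# of about 1,600 processors sitting completely idle.
print(round(idle_equivalent(32_000, 0.95)))
```

The same formula shows why efficiency matters more as machines grow: at a fixed 95 percent, doubling the processor count also doubles the number of processor-equivalents going to waste.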
These strict parameters of supercomputing will force Rensselaer researchers to devise new ways of taking advantage of CCNI’s capabilities, says Wolf von Maltzahn, acting vice president for research.
Von Maltzahn points out that CCNI will encourage faculty researchers to think outside the box, collaborate with new partners, and tackle old problems from new angles, stressing that this will not be limited to computer engineers or those working on semiconductors. Any researcher on campus can submit a proposal for time on CCNI, and the applications will be assessed on a case-by-case basis by a steering committee in the process of being formed.