The creation of a new material has long been either an accident or a matter of trial and error. Steel, for instance, was developed over hundreds of years by people who didn’t know why what they were doing worked (or didn’t work). Generations of blacksmiths observed that iron forged in charcoal was stronger than iron that wasn’t, and iron that was forged in a very high-temperature, charcoal-fired furnace and rapidly cooled was even stronger, and so on.
While we’re still learning things about steel, we now have all kinds of recipes we can use to make steels with different properties depending on the application, but those recipes took a lot of time, sweat and toil to develop. Wouldn’t it be great if we could skip all that trial and error and design new materials from scratch with the exact properties we want?
Fully realized, the Materials Genome Initiative (MGI) would let us do just that.
If Scotty had access to a materials genome in "Star Trek IV: The Voyage Home," he wouldn’t have had to trade the formula for transparent aluminum to get the acrylic glass he needed to build a tank large enough to store a pair of humpback whales in the cargo hold (and thus save the Earth from being destroyed by whale-loving aliens).
No, he could have just told the ship’s computer that he needed a transparent, watertight material that was stronger than tritanium. The computer, having a detailed database of the properties of various elements and compounds, could predict what properties various combinations of those constituents would have and develop a formula for a new material with the desired properties.
(Then it would have just been a matter of gathering the raw materials, so maybe he would have had to trade the formula for transparent aluminum after all.)
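In spirit, that ship’s computer would be running something like the back-of-the-envelope loop sketched below: look up constituent properties in a database, predict what a combination would do, and screen the candidates against a target. This is a minimal Python sketch in which everything, the material names, the numbers and the stiffness target, is invented for illustration; the only real physics is the classical rule-of-mixtures bounds, which bracket a two-phase mixture’s stiffness between a weighted harmonic mean and a weighted average.

```python
# Toy "materials genome" screening: an invented database of constituent
# properties plus classical rule-of-mixtures bounds. All names, numbers
# and the target below are made up for illustration.

# Hypothetical database: stiffness in GPa, and whether the phase is transparent.
DATABASE = {
    "alumina":       {"stiffness": 300.0, "transparent": True},
    "spinel":        {"stiffness": 270.0, "transparent": True},
    "polycarbonate": {"stiffness": 2.4,   "transparent": True},
    "steel":         {"stiffness": 200.0, "transparent": False},
}

def mixture_bounds(p_a, p_b, frac_a):
    """Reuss (lower) and Voigt (upper) bounds for a two-phase mixture."""
    upper = frac_a * p_a + (1.0 - frac_a) * p_b          # weighted average
    lower = 1.0 / (frac_a / p_a + (1.0 - frac_a) / p_b)  # weighted harmonic mean
    return lower, upper

# "Computer, find me a transparent mixture at least this stiff."
TARGET = 150.0  # GPa, made up
transparent = [name for name, p in DATABASE.items() if p["transparent"]]
for i, a in enumerate(transparent):
    for b in transparent[i + 1:]:
        for frac in (0.25, 0.50, 0.75):
            lo, hi = mixture_bounds(DATABASE[a]["stiffness"],
                                    DATABASE[b]["stiffness"], frac)
            if lo >= TARGET:  # even the pessimistic bound clears the target
                print(f"{frac:.0%} {a} / {b}: {lo:.0f}-{hi:.0f} GPa")
```

A real materials-design engine would replace the toy lookup with curated reference data and the two-line bounds with validated physics, but the shape of the loop, query, predict, screen, is the same.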
Admittedly, all that is probably some ways off. In the shorter term, the MGI is working to create what President Obama called a “materials innovation infrastructure.” In the five years since its inception (we just celebrated the MGI’s anniversary at the White House a few weeks ago), we’ve built an integrated platform that brings together computation, theory and experiment, and enables the broad sharing of materials data and software.
Researchers across the country in government, university and industry labs can use these tools to design and run smarter simulations, speeding the development and deployment of materials with whatever properties they seek. Some of the applications being pursued include heat-sinking electronics, long-lived batteries, better body armor, and auto bodies strong enough to save your life and light enough to save you gas.
When I first came to NIST, I was fortunate to find myself among a number of other National Research Council postdoctoral researchers who shared an interest in computational materials science, a field that was just beginning to take off. Indeed, NIST was a great place to do such work, with NIST senior fellow John W. Cahn (as well as many other outstanding scientists) creating an intellectual environment that valued serious theoretical work that could have a real engineering impact.
In 1994, my colleagues and I co-founded the NIST Center for Theoretical and Computational Materials Science (CTCMS). Empowered by the newly created World Wide Web, we set out to provide the basic data and computational tools that industry needed to “get over the hump” and begin its own materials research. Now in its 22nd (!) year, the center is a source for numerous software tools in wide use by both academic and industrial materials researchers.
While the CTCMS was, and is, something we’re proud of, it was tough starting out. We learned that scientists simply didn’t have much incentive to publish their simulation code; they’d much rather publish the findings they made with the code. What was not widely accepted was that a well-written simulation code can be at least as valuable as, if not more valuable than, a paper in a journal. After all, a paper furthers knowledge in one area, but other researchers can pick up and adapt simulation software to study all kinds of different things.
While that sounds great in theory, in practice scientists can be a competitive bunch. Sharing code you worked so hard to develop risks giving your competitors a leg up, and, what’s worse, they often don’t even give you credit when they use it! Thankfully, scientists and their patrons have begun asking that data and code be shared, and demanding that they be cited when used by others.
Changing the culture of scientific prestige is not easy, but it is now happening, and, of course, this kind of data/code sharing is an integral part of the MGI’s DNA.
Luckily, NIST management valued and continues to value our output of simulation codes as a class “A” research product.
But we were ahead of the curve back then.
If we jump forward to 2010, we find that code sharing has become commonplace in, for example, the open-source community, but is still not particularly encouraged or rewarded within the wider research community, as the risks are still largely viewed as outweighing the rewards. Fortuitously, that year the Office of Science and Technology Policy put out a call to federal researchers to participate in formulating a materials modeling and simulation initiative. Chuck Romine and I were selected to serve on the National Science and Technology Council committee that drafted the white paper that resulted in the creation of the MGI. Subsequently, I became, and remain, the executive secretary of the MGI subcommittee.
In the five years since the initiative’s inception, it has grown to more than $25 million in annual funding, including the Chicago-based Center for Hierarchical Materials Design (CHiMaD), a NIST Center of Excellence. CHiMaD specifically supports NIST’s MGI activities to make the exchange of data and models easier, ensure the quality of data and models, and develop a conduit through which new methods and measurements can flow from the infrastructure as it matures.
Simultaneously, NIST has established its Office of Data and Informatics to focus on collecting the highest-quality reference data and developing and disseminating best practices in data science. We are now deploying a materials data repository, a materials data curation system, and, to make discovery easier, a materials resource registry.
One of the ways we know we’re making scientific progress is when we can extend the frontiers of an existing model to describe something new. Even better is when an old model falls in favor of an improved description of reality. The proliferation of data promises to usher in a “fourth paradigm” of data-driven materials science where the discovery of new applications and new materials will become a daily occurrence.
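To make “data-driven” concrete, here is a minimal sketch of the simplest possible version of that loop: fit a model to measured data, then let the model nominate the next experiment. The compositions, the hardness values and the target are all invented for this illustration, and the straight-line fit stands in for the far richer models real workflows use.

```python
import numpy as np

# Made-up "measurements": alloying fraction x vs. measured hardness.
# In a real data-driven workflow these would come from a shared,
# curated repository rather than being typed in by hand.
x = np.array([0.05, 0.10, 0.20, 0.30, 0.40])    # alloying fraction
hardness = np.array([1.8, 2.3, 3.1, 3.6, 4.4])  # arbitrary units

# Fit a simple linear model, hardness ~ a*x + b, by least squares.
a, b = np.polyfit(x, hardness, deg=1)

# Let the model nominate the untested composition predicted to land
# closest to a (made-up) target hardness of 5.0.
target = 5.0
candidates = np.linspace(0.0, 0.6, 61)
best = candidates[np.argmin(np.abs(a * candidates + b - target))]
print(f"model: hardness ~ {a:.2f}*x + {b:.2f}; try x = {best:.2f} next")
```

Swap the hand-typed arrays for a shared data repository and the straight line for a machine-learned model, and you have the skeleton of a modern data-driven materials-discovery workflow.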
By integrating experiment and computation more tightly and organizing and making the results of each more readily available, the MGI is poised to be an integral part of this new paradigm.
It’s an exciting time to be a materials scientist, and I’m thrilled to be a part of the coming scientific revolution in how we discover new materials. Now, if you’ll excuse me, I need to go tinker with my tritanium recipe.