Materials’ microscopic structures and properties are inextricably linked, making customization difficult. Rice University engineers are determined to use machine learning to simplify the process.
To that end, Ming Tang’s Rice lab, in collaboration with physicist Fei Zhou at Lawrence Livermore National Laboratory, developed a technique to predict the evolution of microstructures in materials (structural features between 10 nanometers and 100 microns).
Their open-access paper in the Cell Press journal Patterns demonstrates how neural networks (computer models that mimic the brain’s neurons) can be trained to predict how a structure will grow in a given environment, similar to how a snowflake forms in nature from moisture.
In fact, one of the examples used in the lab’s proof-of-concept study was snowflake-like dendritic crystal structures.
“In modern material science, it’s widely accepted that the microstructure often plays a critical role in controlling a material’s properties,” Tang said. “You want to be able to control not only how the atoms are arranged on lattices, but also the microstructure, in order to achieve good performance and even new functionality.
“The holy grail of designing materials is to be able to predict how a microstructure will change under given conditions, whether we heat it up or apply stress or some other type of stimulation,” he said.
Tang has spent his entire career working to improve microstructure prediction, but he believes the traditional equation-based approach will struggle to keep up with the demand for new materials.
“The tremendous progress in machine learning encouraged Fei at Lawrence Livermore and us to see if we could apply it to materials,” he explained.
Fortunately, there was plenty of data from the traditional method to help train the team’s neural networks, which use early microstructure evolution to predict the next step, and the next one, and so on.
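The idea of predicting "the next step, and the next one, and so on" is an autoregressive rollout: each predicted frame is fed back in as the input for the following prediction. The sketch below illustrates that loop with a toy, hand-written update rule standing in for the trained network (the real model, update rule, and grid size are not specified here and are purely illustrative):

```python
import numpy as np

def step(frame):
    """Stand-in for the trained network's one-step prediction.
    Here: a simple neighborhood-averaging (diffusion-like) update,
    only to make the rollout loop concrete."""
    smoothed = (
        np.roll(frame, 1, axis=0) + np.roll(frame, -1, axis=0) +
        np.roll(frame, 1, axis=1) + np.roll(frame, -1, axis=1)
    ) / 4.0
    return 0.5 * frame + 0.5 * smoothed

def rollout(initial_frame, n_steps):
    """Autoregressively predict n_steps future frames: each output
    becomes the next input, so long trajectories come from one model."""
    frames = [initial_frame]
    for _ in range(n_steps):
        frames.append(step(frames[-1]))
    return frames

frames = rollout(np.random.rand(64, 64), n_steps=50)
print(len(frames))  # 51: the initial frame plus 50 predictions
```

In the actual study the `step` function is a trained neural network rather than a fixed formula, but the feedback loop is the same.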
“This is what machine learning is good at, seeing correlations in a very complex way that the human mind is not able to,” Tang explained. “We take advantage of that.”
The neural networks were tested on four different types of microstructures: plane-wave propagation, grain growth, spinodal decomposition, and dendritic crystal growth.
In each test, the networks were fed between 1,000 and 2,000 sets of 20 successive images illustrating a material’s microstructure evolution as predicted by the equations. After learning the evolution rules from these data, the network was given 1 to 10 images and asked to predict the next 50 to 200 frames, which it usually did in seconds.
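Training data of this kind is typically sliced into (input, next-frame) pairs with a sliding window over each simulated sequence. The sketch below shows that preprocessing step on toy data shaped like the study's 20-frame sequences; the array sizes, context length, and function name are illustrative assumptions, not details from the paper:

```python
import numpy as np

def make_training_pairs(sequences, context=1):
    """Slice simulated evolution sequences into (input, next-frame) pairs.

    sequences: array of shape (n_sequences, n_frames, H, W), e.g. the
    1,000-2,000 sets of 20 successive images described in the study
    (grid size and sequence count here are toy values).
    context: how many preceding frames form each input window.
    """
    inputs, targets = [], []
    for seq in sequences:
        for t in range(context, seq.shape[0]):
            inputs.append(seq[t - context:t])  # preceding frame(s)
            targets.append(seq[t])             # frame to predict
    return np.stack(inputs), np.stack(targets)

# Toy data: 3 sequences of 20 frames of 8x8 "microstructure" grids
data = np.random.rand(3, 20, 8, 8)
X, y = make_training_pairs(data)
print(X.shape, y.shape)  # (57, 1, 8, 8) (57, 8, 8)
```

Each 20-frame sequence yields 19 such pairs with a one-frame context; at inference time, the 1 to 10 seed images play the role of the context window and the model rolls forward on its own predictions.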
The benefits of the new technique were quickly apparent: Compared with the previous algorithm, the neural networks running on graphics processors sped up the grain-growth computations by up to 718 times. Even on a standard central processor, they were still up to 87 times faster than the old method. The other types of microstructure evolution showed similar, albeit less dramatic, speedups.
Tang said comparisons with images from the traditional simulation method showed the predictions were largely accurate. “Based on that, we see how we can update the parameters to make the prediction more and more accurate,” he said. “Then we can use these predictions to help design previously unseen materials.
“Another benefit is that it’s able to make predictions even when we do not know everything about the material properties in a system,” Tang added. “We couldn’t do that with the equation-based method, which needs to know all the parameter values in the equations to perform simulations.”
Tang believes that the computational efficiency of neural networks could hasten the development of new materials. He anticipates that this will be useful in his lab’s ongoing development of more efficient batteries. “We’re thinking about novel three-dimensional structures that will help charge and discharge batteries much faster than what we have now,” Tang explained. “This is an optimization problem that is perfect for our new approach.”