Machine learning plastic deformation

Dec 13, 2018

Perhaps the most intriguing phenomenon in the plastic deformation of crystalline solids is that, as the sample size is decreased towards the micrometre scale, the smooth and orderly fashion in which macroscopic samples stretch and bend becomes increasingly erratic and irregular. This is due to strain bursts originating from the collective dynamics of dislocations, the line-like topological lattice defects whose stress-driven motion gives rise to plastic, irreversible deformation. These bursts appear largely at random during the deformation process and have a broad size distribution, reminiscent of critical phenomena. As a result, performing a set of deformation experiments on micron-scale samples under nominally identical conditions typically yields a collection of erratic-looking stress-strain curves (describing how much stress is needed to reach a given strain), with large variations from sample to sample.

Some time ago, we started wondering whether it would be possible to predict how a specific micron-scale sample will deform, given that its initial dislocation microstructure is known to some degree. What makes this challenging is the apparently random nature of the deformation bursts. On the other hand, dislocation dynamics should follow deterministic equations of motion, in which the instantaneous velocity of a dislocation is set by the total stress acting on it. Hence, neglecting possible complications arising from thermal noise, or from dislocation dynamics being chaotic, the initial state of the sample should in principle completely determine its response to applied stresses.
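To make the equation-of-motion picture concrete, here is a toy sketch (purely illustrative, not the model used in our work): a handful of parallel edge dislocations gliding in one dimension under overdamped dynamics, with a schematic, core-regularised interaction kernel. All parameter values and the kernel itself are made-up assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy overdamped dislocation dynamics: N edge dislocations with Burgers
# vector signs b = ±1 glide along x on parallel slip planes; the velocity
# of each dislocation is proportional to the total stress acting on it.
N = 32
x = rng.uniform(0.0, 1.0, N)      # positions along the glide direction
y = rng.uniform(0.0, 1.0, N)      # slip-plane heights (fixed; glide only)
b = rng.choice([-1.0, 1.0], N)    # Burgers vector signs
a2 = 0.05 ** 2                    # core regularisation (avoids divergence)

def total_stress(x, y, b, sigma_ext):
    """External stress plus pairwise elastic interaction stress (toy kernel;
    the real 2D interaction is anisotropic and long-ranged)."""
    dx = x[:, None] - x[None, :]
    dy = y[:, None] - y[None, :]
    r2 = dx**2 + dy**2 + a2       # dx = 0 on the diagonal -> no self-term
    sigma_int = (b[None, :] * dx / r2).sum(axis=1)
    return sigma_ext + sigma_int

def relax(x, y, b, sigma_ext, dt=1e-4, steps=2000):
    """Overdamped dynamics: dx_i/dt = b_i * sigma_i (mobility set to 1)."""
    for _ in range(steps):
        x = x + dt * b * total_stress(x, y, b, sigma_ext)
    return x

x_final = relax(x, y, b, sigma_ext=0.5)
plastic_strain = np.sum(b * (x_final - x))   # net slip ~ plastic strain
```

In this deterministic picture the final configuration, and hence the accumulated plastic strain, is fully fixed by the initial positions and signs.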

For a human, figuring out the complex, non-linear mapping from the initial microstructure to the stress-strain curve is not really feasible; at least none of us could tell, just by looking at a dislocation configuration, how it would deform once external stresses were applied. It turned out, however, that machine learning (ML) can do so with surprisingly good accuracy. We simulated a dislocation dynamics model to generate a large number of artificial stress-strain curves, each corresponding to a unique, randomly generated initial dislocation configuration. We then used these data to train a neural network to infer a mapping from a set of features of the initial states to the stress-strain curve.
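As a rough sketch of this pipeline (purely illustrative: the features, targets, and network here are synthetic stand-ins, not those of the paper), one can train a small neural network to map feature vectors of initial configurations to discretised stress-strain curves:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the training data: each sample is a feature vector
# describing an initial dislocation configuration, and the target is the
# stress-strain curve discretised at n_strain points.
n_samples, n_features, n_strain = 500, 8, 20
X = rng.normal(size=(n_samples, n_features))
W_true = rng.normal(size=(n_features, n_strain))
Y = np.cumsum(np.tanh(X @ W_true), axis=1) * 0.05   # monotone-ish curves
Y += 0.01 * rng.normal(size=Y.shape)                # burst-like noise

# One-hidden-layer network trained by full-batch gradient descent (MSE loss).
h, lr = 32, 0.01
W1 = rng.normal(scale=0.1, size=(n_features, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=0.1, size=(h, n_strain));   b2 = np.zeros(n_strain)
for _ in range(5000):
    A = np.tanh(X @ W1 + b1)              # hidden activations
    P = A @ W2 + b2                       # predicted curves
    G = 2.0 * (P - Y) / n_samples         # dLoss/dP
    gW2 = A.T @ G; gb2 = G.sum(axis=0)
    GA = (G @ W2.T) * (1.0 - A**2)        # backprop through tanh
    gW1 = X.T @ GA; gb1 = GA.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = np.mean((P - Y) ** 2)
baseline = np.mean((Y - Y.mean(axis=0)) ** 2)   # "predict the mean" error
```

A model that has learned anything about the features-to-curve mapping should push its error well below the baseline of always predicting the ensemble-averaged curve; in practice one would of course evaluate on held-out configurations.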

The next step was to test how well the network had learned the deformation dynamics, and to interpret the result as a measure of deformation predictability. The first surprise was that although predictability initially deteriorated with increasing strain, presumably due to the onset of strain burst activity, it started to improve again towards the end of the deformation process. Moreover, we found an interesting size effect: larger systems are more predictable.
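One simple way to quantify predictability as a function of strain (illustrative only, not necessarily the exact score used in the paper) is a coefficient of determination computed across the ensemble at each strain value; here the "predictions" are faked so that their error first grows and then shrinks with strain:

```python
import numpy as np

rng = np.random.default_rng(2)

# Ensemble of "true" stress-strain curves and fake predictions whose error
# peaks at intermediate strain (mimicking the onset of burst activity).
n_samples, n_strain = 400, 25
true = np.cumsum(rng.normal(0.04, 0.02, size=(n_samples, n_strain)), axis=1)
err_scale = 0.05 * np.sin(np.linspace(0, np.pi, n_strain))
pred = true + rng.normal(size=true.shape) * err_scale

# Predictability score per strain point: R^2 = 1 - MSE / Var, where the
# variance is taken over the ensemble of samples at that strain.
mse = np.mean((pred - true) ** 2, axis=0)
var = np.var(true, axis=0)
r2_vs_strain = 1.0 - mse / var
```

With this construction the score dips at intermediate strains and recovers towards the end, the same qualitative shape as the predictability curve described above.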

It should be emphasised that the results discussed in our paper really rely on applying ML, and could not have been obtained using traditional methods. In our opinion, this nicely illustrates the usefulness of ML in a wide range of fields, including (but not limited to) the predictability and optimisation of materials deformation, and ML should become part of the toolbox of many more scientists and engineers working on materials physics and beyond.


Lasse Laurson

Associate Professor (tenure track), Tampere University of Technology
