Abstract
Deep neural networks have become popular architectures for fitting coarse-grained molecular dynamics (CGMD) potentials owing to their ability to describe complex features and their ease of training against large datasets. However, such architectures are far more complicated than traditional functional forms, which raises the question of whether a similarly data-driven approach using a simpler functional form would offer any advantages. Here, we develop a genetic algorithm that optimizes Lennard-Jones potentials for coarse-grained models of several crystal- and liquid-crystal-forming materials based on both structural and thermodynamic information. A detailed description of the genetic algorithm, its loss function, and its hyperparameters is presented. The models developed by the algorithm reproduce a much broader range of physical properties than is achieved by simple functional forms parameterized with traditional algorithms. They also show surprising transferability, reproducing properties that were not directly trained against. Simulations of larger systems with these models stabilize the reference crystal structure on long time scales, preserve melting-point trends, and reproduce liquid crystalline phase transitions, despite this information being absent from training. Despite the rush to adopt neural network potentials, these case studies show that simpler functional forms retain untapped potential for CGMD when coupled with data-driven training algorithms.
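To illustrate the overall approach described above, the following is a minimal sketch of a genetic algorithm optimizing Lennard-Jones (epsilon, sigma) parameters against a combined structural and thermodynamic loss. All names, population settings, and the placeholder loss are illustrative assumptions and do not reproduce the authors' implementation, which evaluates candidates with coarse-grained simulations.

```python
# Minimal genetic-algorithm sketch for tuning Lennard-Jones parameters.
# Hypothetical settings and a dummy loss stand in for the simulation-based
# evaluation (e.g., comparing RDFs and densities to reference data).
import random

N_GEN, POP_SIZE, N_ELITE, MUT_SCALE = 50, 64, 8, 0.05

def evaluate_loss(params):
    """Placeholder loss surface so the sketch runs; in practice this would
    launch a CG simulation with the candidate (epsilon, sigma) values and
    score structural and thermodynamic observables against references."""
    return sum((p - 1.0) ** 2 for p in params)

def mutate(params):
    # Gaussian perturbation, kept positive since epsilon and sigma must be > 0.
    return [max(1e-3, p + random.gauss(0.0, MUT_SCALE)) for p in params]

def crossover(a, b):
    # Uniform crossover: each gene taken from either parent at random.
    return [random.choice(pair) for pair in zip(a, b)]

# Each individual is a flat list of (epsilon, sigma) values for all bead types.
population = [[random.uniform(0.5, 2.0) for _ in range(4)] for _ in range(POP_SIZE)]

for gen in range(N_GEN):
    ranked = sorted(population, key=evaluate_loss)
    elites = ranked[:N_ELITE]
    children = [mutate(crossover(random.choice(elites), random.choice(elites)))
                for _ in range(POP_SIZE - N_ELITE)]
    population = elites + children

best = min(population, key=evaluate_loss)
print("best parameters:", best, "loss:", evaluate_loss(best))
```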
Supplementary materials
Supporting Information: contains additional details on the genetic algorithm and supporting data figures that are referenced in the main text.