Editors: Kristof T. Schütt, Stefan Chmiela, Anatole von Lilienfeld, Alexandre Tkatchenko, Koji Tsuda, Klaus-Robert Müller
The upcoming book covers the topics of the IPAM long program "Understanding Many-Particle Systems with Machine Learning" and our recently organized NIPS workshop "Machine Learning for Molecules and Materials". It will comprise technical parts covering representations for molecules and materials, ML across chemical compound space, and ML for potential energy surfaces.
Deadline: August 31, 2018
|[contribution of user mueller will appear here: Introduction to kernel learning]|
|[contribution of user montavon will appear here: Introduction to neural networks]|
|Kieron Burke: Finding density functionals with machine learning|
We review progress toward using machine learning to find density functionals.
|Gábor Csányi, Michael Willatt, Michele Ceriotti: Machine-learning of atomic-scale properties based on physical principles|
We briefly summarize the kernel regression approach, as used recently in materials modelling, to fitting functions, particularly potential energy surfaces, and highlight how the linear algebra framework can be used to both predict and train from linear functionals of the potential energy, such as the total energy and atomic forces. We then give a detailed account of the Smooth Overlap of Atomic Positions (SOAP) representation and kernel, showing how it arises from an abstract representation of smooth atomic densities, and how it is related to several popular density-based representations of atomic structure. We also discuss recent generalisations that allow fine control of correlations between different atomic species, prediction and fitting of tensorial properties, and the construction of structural kernels---applicable to comparing entire molecules or periodic systems---that go beyond an additive combination of local environments.
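The kernel regression approach summarized above can be illustrated with a minimal numpy sketch: kernel ridge regression with a Gaussian kernel, fit to a toy one-dimensional Morse-style potential. All function names and the toy data are illustrative only, not taken from the chapter.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise squared distances between rows of X and Y
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-d2 / (2 * sigma**2))

def krr_fit(X, y, sigma=1.0, lam=1e-8):
    # Solve (K + lam*I) alpha = y for the regression weights
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_new, sigma=1.0):
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

# Toy 1D "potential energy surface": a Morse-like curve
rng = np.random.default_rng(0)
X = rng.uniform(0.5, 3.0, (50, 1))
y = (1 - np.exp(-(X[:, 0] - 1.0)))**2  # Morse potential with D=1, a=1

alpha = krr_fit(X, y, sigma=0.5)
y_pred = krr_predict(X, alpha, X, sigma=0.5)
```

With a small regularizer the model interpolates the reference energies almost exactly; forces would follow from the analytic derivative of the kernel, as discussed in the chapter.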
|Matti Hellström and Jörg Behler: High-Dimensional Neural Network Potentials for Atomistic Simulations|
High-dimensional neural network potentials, proposed by Behler and Parrinello in 2007, have become an established method to calculate potential energy surfaces with first-principles accuracy at a fraction of the computational cost. The method is general and can describe all types of chemical interactions (e.g. covalent, metallic, hydrogen bonding, and dispersion) for the entire periodic table, including chemical reactions in which bonds break or form. Typically, many-body atom-centered symmetry functions, which incorporate the translational, rotational and permutational invariances of the potential-energy surface exactly, are used as descriptors for the atomic environments. This chapter describes how such symmetry functions and high-dimensional neural network potentials are constructed and validated.
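A minimal sketch of one such descriptor, a radial atom-centered symmetry function of the Behler type (often called G2) with a cosine cutoff, shows the invariances mentioned above; parameter values and the toy cluster are illustrative, not from the chapter:

```python
import numpy as np

def cutoff(r, rc):
    # Cosine cutoff function f_c(r); zero beyond the cutoff radius rc
    return np.where(r < rc, 0.5 * (np.cos(np.pi * r / rc) + 1.0), 0.0)

def g2(positions, center, eta, rs, rc):
    # Radial symmetry function for the atom at index `center`:
    # a sum of Gaussians over all neighbour distances, damped by f_c
    rij = np.linalg.norm(positions - positions[center], axis=1)
    rij = np.delete(rij, center)  # exclude the self-distance
    return np.sum(np.exp(-eta * (rij - rs)**2) * cutoff(rij, rc))

# A toy 4-atom cluster
pos = np.array([[0.0, 0.0, 0.0],
                [1.0, 0.0, 0.0],
                [0.0, 1.2, 0.0],
                [0.0, 0.0, 0.9]])
val = g2(pos, 0, eta=1.0, rs=1.0, rc=5.0)

# Rotating the cluster leaves the descriptor unchanged
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
val_rot = g2(pos @ R.T, 0, eta=1.0, rs=1.0, rc=5.0)
```

Because the function depends only on interatomic distances and sums over neighbours, it is exactly invariant under rotations, translations, and permutations of like atoms.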
|: Covariant neural network architectures|
|[contribution of user rupp will appear here: MBTR]|
|[contribution of user faber will appear here: SLATM]|
|[contribution of user gilmer will appear here: Message-passing neural networks]|
|[contribution of user chmiela will appear here: Representations, domain knowledge, invariances]|
|Kristof T. Schütt, Alexandre Tkatchenko, Klaus-Robert Müller: Learning representations of molecules and materials with atomistic neural networks|
Deep learning has been shown to learn efficient representations for structured data such as images, text, or audio. In this chapter, we present neural network architectures that are able to learn efficient representations of molecules and materials. In particular, the continuous-filter convolutional network SchNet accurately predicts chemical properties across compositional and configurational space on a variety of datasets. Beyond that, we analyze the obtained representations to find evidence that their spatial and chemical properties agree with chemical intuition.
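The core idea of a continuous-filter convolution can be sketched in a few lines of numpy: atom features are mixed with filters generated from interatomic distances via a radial basis expansion and a small network. This is a schematic, untrained toy (random weights, no cutoff, hypothetical layer sizes), not the SchNet implementation itself:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_expand(d, centers, gamma=10.0):
    # Radial basis expansion of interatomic distances
    return np.exp(-gamma * (d[..., None] - centers)**2)

def cfconv(x, positions, W1, W2, centers):
    # Continuous-filter convolution: features of atom j are weighted by
    # a filter generated from the distance r_ij, then summed over j
    d = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
    filt = np.tanh(rbf_expand(d, centers) @ W1) @ W2  # shape (n, n, F)
    return np.einsum('jf,ijf->if', x, filt)

n_atoms, n_feat, n_rbf = 5, 8, 16
x = rng.normal(size=(n_atoms, n_feat))      # per-atom feature vectors
pos = rng.normal(size=(n_atoms, 3))         # atomic positions
centers = np.linspace(0.0, 5.0, n_rbf)
W1 = rng.normal(size=(n_rbf, n_feat))
W2 = rng.normal(size=(n_feat, n_feat))

y = cfconv(x, pos, W1, W2, centers)
```

Since the filters depend only on distances, the output is invariant under rigid rotations and translations of the positions, which is one reason such architectures suit molecules and materials.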
|: Machine Learning for Molecular Dynamics|
|Aldo Glielmo, Claudio Zeni, Ádám Fekete, Alessandro De Vita: |
|Rodrigo Vargas-Hernandez: Gaussian Process Regression for Extrapolation of Properties of Complex Quantum Systems across Quantum Phase Transitions|
|[contribution of user vogt-moranto will appear here: *TBA*]|
|[contribution of user sauceda will appear here: GDML]|
|Michael Gastegger, Philipp Marquetand: Molecular dynamics with neural-network potentials|
Molecular dynamics simulations are an important tool for describing the evolution of a chemical system with time. However, these simulations are inherently held back either by the prohibitive cost of accurate electronic structure theory computations or the limited accuracy of classical empirical force fields. Machine learning techniques can help to overcome these limitations by providing access to potential energies, forces and other molecular properties modeled directly after an accurate electronic structure reference at only a fraction of the original computational cost. The present text discusses several practical aspects of conducting machine-learning-driven molecular dynamics simulations. First, we study the efficient selection of reference data points on the basis of an active-learning-inspired adaptive sampling scheme. This is followed by the analysis of a machine-learning-based model for simulating molecular dipole moments in the framework of predicting infrared spectra via molecular dynamics simulations. Finally, we show that machine learning models can offer valuable aid in understanding chemical systems beyond a simple prediction of quantities.
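The adaptive sampling idea mentioned above can be sketched with a query-by-committee toy: train an ensemble on the current reference data and request new reference calculations where the committee disagrees most. The polynomial committee and the sine "reference" below are stand-ins chosen for brevity, not the chapter's actual models:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_committee(X, y, n_models=4, deg=3):
    # Committee of polynomial fits, each trained on a bootstrap sample
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, len(X), len(X))
        models.append(np.polyfit(X[idx], y[idx], deg))
    return models

def disagreement(models, X_new):
    # Standard deviation of committee predictions: a cheap error proxy
    preds = np.stack([np.polyval(c, X_new) for c in models])
    return preds.std(axis=0)

f = lambda x: np.sin(2 * x)              # stand-in for the ab initio reference
X = np.sort(rng.uniform(-1.0, 1.0, 8))   # sparse initial reference data
y = f(X)

models = fit_committee(X, y)
candidates = np.linspace(-2.0, 2.0, 201)  # e.g. configurations visited in MD
query = candidates[np.argmax(disagreement(models, candidates))]
```

The committee disagrees most outside the sampled interval, so the scheme automatically requests reference data in poorly covered regions of configuration space, which is the essence of adaptive sampling for ML-driven MD.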
|[contribution of user hart will appear here: Cluster expansion]|
|[contribution of user isayev will appear here: ANI]|
|[contribution of user ramprasad will appear here: Polymer design]|
|Daniel Schwalbe Koda; Rafael Gómez-Bombarelli: Generative Models for Automatic Chemical Design|
Materials discovery is decisive for tackling urgent challenges related to energy, the environment, or health care. In chemistry, conventional methodologies for innovation usually rely on expensive and incremental optimization strategies. Building a reliable mapping between structures and properties enables efficient navigation of molecular space and much more rapid design and optimization of novel useful compounds. In this chapter, we review how current deep learning and generative models address this inverse chemical design paradigm. We begin by introducing generative models in deep learning and categorizing them according to their architecture and molecular representation. The evolution and performance of popular chemical generative models in the literature are then reviewed. Finally, we highlight the prospects and challenges of automatic chemical design as a cutting-edge tool in materials development and technological progress.
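A common molecular representation for the generative models reviewed here is a padded, one-hot encoded SMILES string fed to a VAE- or RNN-style model. A minimal encoding/decoding sketch (the tiny alphabet and maximum length are hypothetical; real models use the full SMILES vocabulary):

```python
import numpy as np

# Hypothetical minimal alphabet; ' ' is the padding character
ALPHABET = ['C', 'O', 'N', '(', ')', '=', '1', ' ']
CHAR_TO_IDX = {c: i for i, c in enumerate(ALPHABET)}

def smiles_to_onehot(smiles, max_len=12):
    # Pad and one-hot encode a SMILES string, as done before feeding
    # it to a character-level generative model
    padded = smiles.ljust(max_len)
    x = np.zeros((max_len, len(ALPHABET)))
    for t, ch in enumerate(padded):
        x[t, CHAR_TO_IDX[ch]] = 1.0
    return x

def onehot_to_smiles(x):
    # Decode by taking the most likely character at each position
    return ''.join(ALPHABET[i] for i in x.argmax(axis=1)).rstrip()

x = smiles_to_onehot('CC(=O)O')  # acetic acid
```

The decoder side of a generative model outputs a probability matrix of the same shape, so the same argmax decoding recovers a candidate molecule; ensuring the decoded string is syntactically and chemically valid is one of the challenges the chapter discusses.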
|[contribution of user armiento will appear here: Crystal predictions]|
|[contribution of user huang will appear here: AMONS]|
|[contribution of user tsuda will appear here: Automatic complex materials design]|
|Atsuto Seko and Hiroyuki Hayashi: Recommender systems for materials discovery|
|Giuseppe Carleo: Neural-Network Quantum States|
|[contribution of user tkachenko will appear here: QuantumMachine.org]|
|[contribution of user lilienfeld will appear here: QuantumMachine.org]|
The following page provides information and the LaTeX style files to prepare the chapters for LNP: