Abstract
In this work, we explore how existing datasets of quantum chemical properties can be repurposed to
build data-efficient downstream machine learning models, with a particular focus on predicting the
activation energy of hydrogen atom transfer (HAT) reactions. Starting from a valence bond (VB)
analysis of a generic HAT process, a set of informative descriptors is identified. Next, with the help of a
publicly available dataset of pre-computed quantum chemical properties of organic radicals, a surrogate
neural network model is trained to predict an informative representation based on the identified
VB descriptors. We demonstrate that coupling the resulting on-the-fly informative
representation to a secondary machine-learning model for activation energy prediction outperforms
various predictive model architectures starting from conventional machine-learning inputs by a wide
margin, at no additional computational cost. As a bonus, by basing their final predictions on
physically meaningful descriptors, our models become inherently interpretable. Finally, because of
the extreme data efficiency of our descriptor-augmented models, we are able to fine-tune and apply
them to small datasets across a variety of reaction conditions and application domains, ranging
from regular (liquid-phase) synthesis, through metabolism and drug design, to atmospheric chemistry.
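
To make the two-stage workflow concrete, the sketch below illustrates it in generic terms: a surrogate neural network is first trained on a large dataset of pre-computed quantum chemical properties to predict VB-inspired descriptors from a reaction fingerprint, and the predicted descriptors then serve as the interpretable input representation for a small downstream activation-energy model. All names (e.g. SurrogateNet), dimensions, the random stand-in data, and the choice of a random-forest regressor for the secondary model are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch of a descriptor-based two-stage pipeline (illustrative only).
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestRegressor


class SurrogateNet(nn.Module):
    """Surrogate model: maps a reaction fingerprint to a small set of
    VB-inspired descriptors (hypothetical stand-ins for quantities such as
    bond dissociation energies or radical stabilities)."""

    def __init__(self, n_features: int, n_descriptors: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 256), nn.ReLU(),
            nn.Linear(256, 64), nn.ReLU(),
            nn.Linear(64, n_descriptors),
        )

    def forward(self, x):
        return self.net(x)


# Stage 1: train the surrogate on a large dataset of pre-computed QM
# properties (random arrays stand in for real fingerprints and labels).
X_qm = np.random.rand(1000, 2048).astype(np.float32)   # fingerprints
y_qm = np.random.rand(1000, 6).astype(np.float32)      # QM-derived descriptors
surrogate = SurrogateNet(2048, 6)
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
for _ in range(10):                                     # abbreviated training loop
    opt.zero_grad()
    loss = nn.functional.mse_loss(surrogate(torch.from_numpy(X_qm)),
                                  torch.from_numpy(y_qm))
    loss.backward()
    opt.step()

# Stage 2: use the predicted descriptors as the input representation for a
# small downstream dataset of HAT activation energies.
X_hat = np.random.rand(200, 2048).astype(np.float32)    # small HAT dataset
y_ea = np.random.rand(200)                              # activation energies
with torch.no_grad():
    descriptors = surrogate(torch.from_numpy(X_hat)).numpy()
downstream = RandomForestRegressor(n_estimators=200).fit(descriptors, y_ea)
print(downstream.predict(descriptors[:3]))              # example predictions
```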