Abstract
There is growing interest across many fields in applying high-throughput approaches to access the dynamics of chemical processes. As high-throughput experimentation generates ever-increasing amounts of data, the development of fully integrated workflows becomes crucial. These workflows should combine novel experimental tools and interpretation methods to convert the data into valuable information. To design feasible data-driven workflows, it is necessary to estimate the value of information and balance it against the number of experiments and resources required. Basing such workflows on actual physical models appears to be a more feasible strategy than data-intensive empirical statistical methods. Here we present an algorithm that constructs and evaluates kinetic models of different complexity. The algorithm makes it possible to assess the quality and quantity of experimental data needed to reliably recover the rates driving the corresponding chemical models. We quantify the influence of data quality and quantity on the results through the accuracy of the estimated kinetic parameters. We also show that this method can find correct reaction scenarios directly from simulated kinetic data with little to no overfitting. Models that fit theoretical data well can then serve as a proxy for optimizing the underlying chemical systems. Because it takes real physical effects into account, this approach goes further: we show that kinetic models provide a direct, unbiased, quantitative connection between kinetic data and the reaction scenario.
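As a minimal, hypothetical illustration of the kind of workflow described above (a sketch under stated assumptions, not the paper's actual algorithm), the following Python snippet fits two candidate mechanisms of different complexity, A -> C and A -> B -> C, to a simulated noisy product trace and scores them with an AIC-style penalty against overfitting. The mechanisms, the noise level sigma (data quality), the number of time points (data quantity), and the penalty form are all illustrative assumptions.

import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
t_obs = np.linspace(0.0, 10.0, 50)

def product_trace(rhs, k, n_species):
    """Integrate a candidate mechanism and return the product concentration."""
    y0 = [1.0] + [0.0] * (n_species - 1)         # start from pure reactant
    sol = solve_ivp(rhs, (0.0, t_obs[-1]), y0, args=(k,),
                    t_eval=t_obs, rtol=1e-8)
    return sol.y[-1]                              # last species = product

# Candidate mechanisms of increasing complexity (hypothetical examples).
def one_step(t, y, k):                            # A -> C
    a, c = y
    return [-k[0] * a, k[0] * a]

def two_step(t, y, k):                            # A -> B -> C
    a, b, c = y
    return [-k[0] * a, k[0] * a - k[1] * b, k[1] * b]

# Synthetic "measured" product trace: two-step truth plus Gaussian noise.
k_true, sigma = [0.8, 0.3], 0.02
c_obs = product_trace(two_step, k_true, 3) + rng.normal(0.0, sigma, t_obs.size)

def fit(rhs, n_species, n_params):
    """Least-squares estimate of the rate constants for one candidate model."""
    res = least_squares(
        lambda k: product_trace(rhs, k, n_species) - c_obs,
        x0=np.full(n_params, 0.5), bounds=(1e-6, 10.0))
    rss = 2.0 * res.cost                          # res.cost = 0.5 * sum(r**2)
    # AIC-style score: penalize extra rate constants to limit overfitting.
    aic = t_obs.size * np.log(rss / t_obs.size) + 2 * n_params
    return res.x, aic

for name, rhs, ns, npar in [("A->C", one_step, 2, 1),
                            ("A->B->C", two_step, 3, 2)]:
    k_hat, aic = fit(rhs, ns, npar)
    print(f"{name}: k = {np.round(k_hat, 3)}, AIC = {aic:.1f}")

In this toy setting, lowering sigma or adding time points tightens the recovered rate constants around their true values, which is one way to probe the data quality and quantity requirements the abstract refers to.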
Supplementary materials
Supporting Information
The supporting information contains:
- Examples of kinetic parameter derivation.
- Validation that the method works across many different chemistries and chemical phenomena.
- Validation of the accuracy of the method.
- Information on the error measure used in the paper.