Abstract
The proper balancing of information from experiment and theory is a long-standing problem in the analysis of noisy and incomplete data. When the analysis is viewed as a Pareto optimization problem, improved agreement with the experimental data comes at the expense of growing inconsistencies with the theoretical reference model. Here, we propose how to set the exchange rate a priori to properly balance this trade-off. We focus on gentle ensemble refinement, where the difference between the potential energy surfaces of the reference and refined models is small on a thermal scale. By relating the variance of this energy difference to the Kullback-Leibler divergence between the respective Boltzmann distributions, one can encode prior knowledge about energy uncertainties, i.e., force-field errors, in the exchange rate. We highlight the relation of gentle refinement to free energy perturbation theory. A balanced encoding of prior knowledge increases the quality and transparency of ensemble refinement. Our findings extend to non-Boltzmann distributions, where the uncertainty in energy becomes an uncertainty in information.
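The stated relation between the variance of the energy difference and the Kullback-Leibler divergence can be illustrated with a minimal numerical sketch. For a small perturbation ΔU of the potential, second-order free energy perturbation theory gives D_KL ≈ β²·Var(ΔU)/2. The harmonic reference potential, the linear perturbation, and all parameter values below are illustrative choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 1.0   # inverse temperature (illustrative units)
eps = 0.1    # perturbation strength, small on the thermal scale

# Reference model: U0(x) = x^2/2, whose Boltzmann distribution
# at beta = 1 is the standard normal N(0, 1).
x = rng.normal(0.0, 1.0, 200_000)

# Refined model: U1(x) = U0(x) + eps * x, so the energy
# difference is dU(x) = eps * x.
dU = eps * x

# Gentle-refinement estimate: D_KL ≈ beta^2 * Var(dU) / 2,
# evaluated over samples from the reference ensemble.
kl_est = 0.5 * beta**2 * dU.var()

# Exact result for this linear perturbation: the refined Boltzmann
# distribution is N(-eps, 1), and KL(N(0,1) || N(-eps,1)) = eps^2 / 2.
kl_exact = 0.5 * eps**2

print(kl_est, kl_exact)
```

For a linear perturbation of a harmonic potential the second-order estimate is exact, so the sampled estimate matches the analytic KL divergence up to Monte Carlo noise; for anharmonic perturbations the agreement holds only in the gentle (small-ΔU) regime the abstract describes.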