Abstract
Hyperparameter optimization using non-linear least-squares regression is a common fitting method for data analysis in chemistry, physics, biology, and engineering. It is often challenging to balance accuracy, computational cost, and the time required to set up the optimization. In this report, I introduce a naïve dynamic grid-searching algorithm named Python jump-chain fitting (PyJCFit), which combines grid-search and stochastic-search methods, to reduce setup time for beginners in early-stage optimization. The idea is to search all parameters heuristically and sequentially (a vector, not a grid matrix) within trusted bounds, using exponentially distributed spacing that concentrates trial points near the current guess. The spacing is partly random, and the chain does not require derivatives of the objective function to optimize. PyJCFit is relatively slow but offers significant advantages for compound equations with discontinuities, peak searching, and global fitting; it also accepts various scoring functions beyond the squared residual, including poorly behaved ones.
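To illustrate the idea described above, the following is a minimal, hypothetical sketch of a jump-chain style search, not the actual PyJCFit implementation: each parameter is updated in turn by trying candidate steps drawn from an exponential distribution (so most trials land near the current value), clipped to the trusted bounds, and a candidate is kept only if a user-supplied score improves. The function name `jump_chain_fit` and all details of the step rule are assumptions for illustration.

```python
import random

def jump_chain_fit(score, guess, bounds, n_iter=1000, rate=10.0, seed=0):
    """Sketch of a derivative-free, sequential (jump-chain style) search.

    score  -- callable taking a parameter list, returning a value to minimize
              (need not be a squared residual; any scoring function works)
    guess  -- initial parameter vector
    bounds -- list of (low, high) trusted bounds, one pair per parameter
    """
    rng = random.Random(seed)
    params = list(guess)
    best = score(params)
    for _ in range(n_iter):
        # parameters are searched sequentially, one at a time (a vector,
        # not a full grid matrix)
        for i, (lo, hi) in enumerate(bounds):
            # exponential step sizes concentrate trials near the current value
            step = (hi - lo) * rng.expovariate(rate)
            for cand in (params[i] - step, params[i] + step):
                cand = min(max(cand, lo), hi)  # stay inside trusted bounds
                trial = params[:i] + [cand] + params[i + 1:]
                s = score(trial)
                if s < best:  # greedy accept: keep only improvements
                    best, params = s, trial
    return params, best
```

For example, fitting a line y = a·x + b by minimizing the sum of squared residuals would pass `score=lambda p: sum((p[0]*x + p[1] - y)**2 for x, y in data)` with bounds such as `[(-10, 10), (-10, 10)]`. Because only the score value is used, no derivatives are needed and the same loop works unchanged for discontinuous or otherwise poorly behaved objectives.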