Data Efficiency of Classification Strategies for Chemical and Materials Design

11 November 2024, Version 3
This content is a preprint and has not undergone peer review at the time of posting.

Abstract

Active learning and design-build-test-learn strategies are increasingly employed to accelerate materials discovery and characterization. Many data-driven materials design campaigns target solutions within constrained domains such as synthesizability, stability, solubility, recyclability, and toxicity. A lack of knowledge about these constraints can hinder design efficiency by producing samples that fail to meet required thresholds. Acquiring this knowledge during the design campaign itself is inefficient, and effective classification of common materials constraints transcends specific design objectives. However, there is no consensus on the most data-efficient algorithm for classifying whether a material satisfies a constraint. To address this gap, we comprehensively compare the performance of 100 strategies designed to classify chemical and materials behavior. Performance is assessed across 31 classification tasks drawn from the chemical and materials science literature. From these results, we recommend best practices for building data-efficient classifiers, showing that neural network- and random forest-based active learning algorithms are the most efficient across tasks. We also show that classification task complexity can be quantified from task metafeatures, most notably the noise-to-signal ratio. Overall, this work provides a comprehensive survey of data-efficient classification strategies, identifies attributes of top-performing strategies, and suggests avenues for further study.
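The benchmarking code itself is not reproduced here; as a minimal sketch of one technique the abstract names, the following pool-based active learning loop pairs a random forest classifier with uncertainty sampling on a synthetic constraint-classification task. The dataset, acquisition schedule, and hyperparameters are illustrative assumptions, not the paper's setup.

```python
# Minimal pool-based active learning sketch with uncertainty sampling.
# Toy data and parameters; not the paper's benchmarking code.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic task: does each "material" satisfy a hidden constraint?
X_pool = rng.uniform(-1, 1, size=(500, 4))                     # candidate features
y_pool = (X_pool[:, 0] + X_pool[:, 1] ** 2 > 0.3).astype(int)  # hidden labels

labeled = list(rng.choice(len(X_pool), size=10, replace=False))  # seed set
unlabeled = [i for i in range(len(X_pool)) if i not in labeled]

for _ in range(20):  # 20 acquisition rounds, one query per round
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_pool[labeled], y_pool[labeled])

    # Uncertainty sampling: query the candidate whose highest predicted
    # class probability is lowest (i.e., the least confident prediction).
    proba = clf.predict_proba(X_pool[unlabeled])
    uncertainty = 1.0 - proba.max(axis=1)
    query = unlabeled[int(np.argmax(uncertainty))]

    labeled.append(query)     # "measure" the queried sample's label
    unlabeled.remove(query)

print(f"Accuracy on the full pool after {len(labeled)} labels: "
      f"{clf.score(X_pool, y_pool):.3f}")
```

Replacing the acquisition rule with random selection recovers a space-filling baseline, which is the kind of comparison the Supporting Information's active learning vs. space-filling analysis describes.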

Keywords

active learning
Bayesian optimization
design of experiments
optimization
design-build-test-learn
phase behavior

Supplementary materials

Supporting Information: Model Implementation and Hyperparameter Tuning; Sensitivity of Top Algorithms to Chosen Tasks and Number of Points; Sensitivity of Active Learning vs. Space-Filling to Chosen Tasks; Sensitivity of Sampler Performance to Number of Points; Survey of Ensemble Strategies
