MOFTransformer: A Multi-modal Pre-training Transformer for Universal Transfer Learning in Metal-Organic Frameworks

20 October 2022, Version 1
This content is a preprint and has not undergone peer review at the time of posting.

Abstract

In this work, we introduce MOFTransformer, a multi-modal Transformer encoder pre-trained on 1 million hypothetical MOFs. The multi-modal model integrates atom-based graph embeddings and energy-grid embeddings to capture the local and global features of MOFs, respectively. By fine-tuning the pre-trained model with small datasets (from 5,000 to 20,000 data points), our model outperforms all other machine learning models across various properties, including gas adsorption, diffusion, electronic properties, and even text-mined data. Beyond its universal transfer learning capabilities, MOFTransformer generates chemical insight by analyzing feature importance derived from the attention scores of its self-attention layers. As such, this model can serve as a bedrock platform for other MOF researchers who seek to develop new machine learning models for their work.
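To make the multi-modal idea concrete, the sketch below shows one way such an encoder can be assembled in PyTorch: atom-based graph embeddings (local features) and energy-grid patch embeddings (global features) are projected to a shared width, prefixed with a [CLS] token, and passed through a single self-attention stack, with a small head on the [CLS] output that is swapped out and fine-tuned per property. This is an illustrative sketch under assumed dimensions and module names (e.g. MultiModalMOFEncoder), not the authors' implementation.

# Minimal multi-modal Transformer encoder sketch (illustrative, not the paper's code).
import torch
import torch.nn as nn

class MultiModalMOFEncoder(nn.Module):
    def __init__(self, atom_feat_dim=64, grid_patch_dim=512, d_model=256,
                 n_heads=4, n_layers=4):
        super().__init__()
        self.atom_proj = nn.Linear(atom_feat_dim, d_model)   # local (atom-graph) modality
        self.grid_proj = nn.Linear(grid_patch_dim, d_model)  # global (energy-grid) modality
        self.cls = nn.Parameter(torch.zeros(1, 1, d_model))  # [CLS] summary token
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)  # regression head, replaced per fine-tuning task

    def forward(self, atom_tokens, grid_patches):
        # atom_tokens: (B, N_atoms, atom_feat_dim); grid_patches: (B, N_patches, grid_patch_dim)
        tokens = torch.cat(
            [self.cls.expand(atom_tokens.size(0), -1, -1),
             self.atom_proj(atom_tokens),
             self.grid_proj(grid_patches)], dim=1)
        encoded = self.encoder(tokens)     # self-attention mixes both modalities
        return self.head(encoded[:, 0])    # predict a scalar property from the [CLS] token

if __name__ == "__main__":
    model = MultiModalMOFEncoder()
    atoms = torch.randn(2, 100, 64)   # toy atom-graph embeddings for 2 MOFs
    grid = torch.randn(2, 216, 512)   # toy energy-grid patch embeddings (e.g. 6x6x6 patches)
    print(model(atoms, grid).shape)   # torch.Size([2, 1])

In this toy setup, fine-tuning would amount to loading pre-trained encoder weights, attaching a fresh head, and training on a few thousand labeled MOFs; the attention weights inside the encoder are what would be inspected to attribute feature importance, as described in the abstract.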

Keywords

MOFs
metal-organic framework
transformer
transfer learning
multi-modal
porous material
machine learning
universal transfer learning

Supplementary materials

Supplementary Information: Supplementary Notes 1-5, Figures 1-12, Table 1, and references.
