Abstract
Large language models (LLMs) have showcased remarkable capabilities in AI for Science (AI4Sci), and chemistry has greatly benefited from the advancement of AI tools. With a strong capacity for learning sequential data such as natural language, LLMs offer immense potential; notably, common representations in chemistry, such as SMILES, are also sequences. We therefore propose leveraging LLMs to jointly model chemical sequences and natural language sequences, aiming to tackle diverse chemical tasks. To fulfill this objective, we introduce BatGPT-Chem, a foundational large-scale model with 15B parameters tailored for chemical engineering. First, we unify diverse tasks in chemistry by modeling them as a combination of natural language and SMILES. Next, building on this unified formulation, we craft prompt templates and generate instruction-tuning data from a substantial volume of chemical data. We then train BatGPT-15B on over one hundred million instruction-tuning instances, enabling it to address tasks such as \textbf{Molecule Description}, \textbf{Molecule Design}, \textbf{Retro-synthesis Prediction}, \textbf{Product Inference}, and \textbf{Yield Prediction}. We release our trial platform at \url{https://www.batgpt.net/dapp/chem}.