Abstract
The Socratic method, grounded in iterative questioning and critical dialogue, offers a compelling framework for leveraging large language models (LLMs) to advance scientific reasoning and discovery in chemistry and materials science. In this paper, we explore how Socratic principles can be integrated into prompt engineering to improve model performance by fostering hypothesis refinement, conceptual clarity, and iterative problem-solving in computational and experimental chemistry. Aligned with chain-of-thought techniques, Socratic prompting enables systematic inquiry and deeper engagement with challenges such as defining key scientific concepts, evaluating competing theoretical models, and refining hypotheses through evidence-based reasoning. We further show how combining multiple Socratic principles in a structured workflow enhances the adaptability and rigor of LLM-based scientific problem-solving. Through worked examples in chemistry and materials research, we illustrate how this method sharpens hypotheses, improves model interpretability, and guides structured scientific reasoning. This work highlights the potential of Socratic prompting both as a reasoning tool for chemistry and as a broader strategy for leveraging LLMs in scientific research.
Supplementary materials
SI Prompts
This zip file contains the prompts described in the paper and the model's responses.