Abstract
Autonomous laboratories have until recently been controlled mainly through scripting languages such as Python, limiting their use among experimentalists. The function-calling feature of OpenAI's ChatGPT API now enables seamless integration and execution of Python subroutines in experimental workflows via voice commands. We have developed the Copilot for Real-world Experimental Scientists (CRESt) system, with a demonstration available on YouTube. Large language models (LLMs) empower all research group members, regardless of coding experience, to leverage the robotic platform for their own projects simply by talking with CRESt.
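The function-calling pattern the abstract refers to can be sketched as follows: the model is given a JSON schema describing an available Python subroutine, and when it replies with a function call (a name plus JSON-encoded arguments), a small dispatcher executes the matching local function. This is a minimal offline sketch; the `move_sample_stage` subroutine and its schema are hypothetical, and the model response is simulated rather than fetched from the API.

```python
import json

# Hypothetical lab subroutine exposed to the model as a callable tool.
def move_sample_stage(x_mm: float, y_mm: float) -> str:
    return f"stage moved to ({x_mm}, {y_mm}) mm"

# Tool schema in the shape used by the OpenAI Chat Completions API.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "move_sample_stage",
        "description": "Move the SEM sample stage to an (x, y) position in mm.",
        "parameters": {
            "type": "object",
            "properties": {
                "x_mm": {"type": "number"},
                "y_mm": {"type": "number"},
            },
            "required": ["x_mm", "y_mm"],
        },
    },
}]

# Map tool names back to the local Python subroutines.
REGISTRY = {"move_sample_stage": move_sample_stage}

def dispatch(tool_call: dict) -> str:
    """Route a model-emitted function call to the matching subroutine."""
    fn = REGISTRY[tool_call["name"]]
    args = json.loads(tool_call["arguments"])
    return fn(**args)

# Simulated model output; in a real session this would come back from
# client.chat.completions.create(..., tools=TOOLS).
fake_call = {"name": "move_sample_stage",
             "arguments": '{"x_mm": 1.5, "y_mm": -0.25}'}
print(dispatch(fake_call))  # → stage moved to (1.5, -0.25) mm
```

In a voice-driven workflow, the transcribed spoken command would be sent as the user message, and the returned tool call dispatched the same way.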
Supplementary weblinks
Title
CRESt - Copilot for Real-world Experimental Scientists
Description
A video demo of the CRESt system on YouTube
Title
ChatGPT-4v + Meta smart glasses helping researchers analyze experiments
Description
This work grafts OpenAI's powerful ChatGPT-4v model onto Meta's latest smart glasses.
Title
AI's First Adventure into the Microscopic World - CRESt Autonomous SEM
Description
This work integrates the latest ChatGPT-4V API into a Python-script-driven Scanning Electron Microscope (SEM) developed by Thermo Fisher Nanoscience Instruments. Leveraging the gpt-4-1106-preview model, the SEM agent can autonomously operate the SEM, guided by image information extracted by a vision agent powered by the gpt-4-vision-preview model, ultimately fulfilling requests raised by human researchers. This marks a pioneering venture in which AI delves into the microscopic world, down to the SEM's 2-nanometer imaging resolution, armed with EDX elemental composition mapping.
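The autonomous-SEM entry above describes a two-agent loop: a vision agent summarizes the current micrograph and a control agent chooses the next instrument action. The structure can be sketched offline with stubbed agents; every name here (`FakeSEM`, `vision_agent`, `sem_agent`) is hypothetical and stands in for the real model calls and instrument API, which are not shown in the source.

```python
class FakeSEM:
    """Toy instrument stand-in so the loop can run without hardware."""
    def __init__(self):
        self.focused = False
        self.mag = 50000
    def acquire(self):
        return {"mag": self.mag, "sharp": self.focused}
    def execute(self, action):
        if action["command"] == "autofocus":
            self.focused = True

def vision_agent(image):
    """Stub for the gpt-4-vision-preview call that describes a micrograph."""
    return "image is in focus" if image["sharp"] else "particles look blurry"

def sem_agent(observation):
    """Stub for the gpt-4-1106-preview call that picks the next command."""
    if "blurry" in observation:
        return {"command": "autofocus"}
    return {"command": "capture"}

def control_loop(sem, max_steps=5):
    """Alternate vision and control agents until a sharp image is captured."""
    for _ in range(max_steps):
        image = sem.acquire()
        action = sem_agent(vision_agent(image))
        if action["command"] == "capture":
            return image
        sem.execute(action)

final = control_loop(FakeSEM())
print(final)  # → {'mag': 50000, 'sharp': True}
```

In the real system the two stub functions would be replaced by API calls to the respective models, with the vision agent receiving the acquired SEM image rather than a dictionary.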