Peter Thiel once said, "We wanted flying cars, instead we got 140 characters" (a reference to Twitter). Over the past decade, we have made great strides in the world of bits (the internet), but progress in the world of atoms (hard technology) has been comparatively slow.
The accumulation of linguistic data propelled the development of machine learning and ultimately led to the emergence of Large Language Models (LLMs). Driven by AI, progress in the world of atoms is now accelerating as well. Methods such as Deep Potential, by learning from quantum mechanical data, have extended the spatio-temporal scales of microscopic simulation by several orders of magnitude and enabled significant advances in fields such as drug design, materials design, and chemical engineering.
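To make the core idea concrete, here is a minimal, hypothetical sketch of a machine-learned potential: a small neural network is fitted to energies produced by a reference method, after which it can replace that expensive method inside a simulation. A Lennard-Jones dimer stands in for "quantum mechanical data" here, and the one-dimensional descriptor is a toy choice; the actual Deep Potential method (DeePMD-kit) uses learned many-body descriptors and is far more sophisticated.

```python
# Minimal sketch: fit a neural network to reference energies.
# Hypothetical toy setup, not the Deep Potential implementation.
import torch
import torch.nn as nn

torch.manual_seed(0)

def reference_energy(r):
    """Toy stand-in for a quantum-mechanical calculation:
    Lennard-Jones energy of a two-atom system."""
    return 4.0 * (r**-12 - r**-6)

# "Training data": interatomic distances and their reference energies.
r = torch.linspace(0.9, 2.5, 200).unsqueeze(1)
e = reference_energy(r)

# A small MLP mapping a simple descriptor (here 1/r) to energy.
model = nn.Sequential(
    nn.Linear(1, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(2000):
    pred = model(1.0 / r)              # descriptor -> predicted energy
    loss = ((pred - e) ** 2).mean()    # match the reference method
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final MSE: {loss.item():.2e}")
```

Once trained, evaluating the surrogate is orders of magnitude cheaper than rerunning the reference calculation, which is what allows simulations to reach far larger length and time scales.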
The accumulation of quantum mechanical data is gradually covering the entire periodic table, and the Deep Potential team has begun putting the DPA pre-trained model into practice. By analogy with the trajectory of LLMs, we are on the eve of the emergence of a general Large Atom Model (LAM). We also believe that open source and openness will play an increasingly important role in LAM's development.