Veterans of artificial intelligence (AI), while acknowledging the unexpected, almost magical side of LLMs, tend to point out that the previous wave, that of classic machine learning and deep learning, powered by data and mathematical models, still retains great importance, particularly in vertical applications: optimization of industrial processes, computer vision, etc.
For specialists, two thirds of today's commercial opportunities still lie in this type of application, behind the wave of hype that has drawn all the attention to generative AI.
A new architecture
However, AI approaches grounded in mathematics and data are preparing their return with a new architecture: LQMs, or Large Quantitative Models.
The ambition here is a new class of foundation models, built on very large data sets and on primitives that best model various behaviors of the real world, following physical, mathematical and biological laws.
These LQMs are expected to considerably advance specific areas that have escaped LLMs: modeling new materials (for example, to design innovative batteries), biochemical processes (to discover new drugs), etc.
“If we continue to practice medicine as we do today, we are heading for disaster. AI will help us cope with the aging of the baby-boom generation”
We already knew it: LLMs understand the real world rather poorly (through language alone). LQMs bring the power of these architectures to an understanding of the physical and biological world.