Google DeepMind has used artificial intelligence (AI) to predict the structure of more than 2 million new materials, a breakthrough it said could soon be used to improve real-world technologies.
In a research paper published in the science journal Nature on Wednesday, the Alphabet-owned AI firm said almost 400,000 of its hypothetical material designs could soon be produced in lab conditions.
Potential applications for the research include the production of better-performing batteries, solar panels and computer chips.
The discovery and synthesis of new materials can be a costly and time-consuming process. For example, it took around two decades of research before lithium-ion batteries – today used to power everything from phones and laptops to electric vehicles – were made commercially available.
“We're hoping that big improvements in experimentation, autonomous synthesis, and machine learning models will significantly shorten that 10 to 20-year timeline to something that's much more manageable,” said Ekin Dogus Cubuk, a research scientist at DeepMind.
DeepMind’s AI was trained on data from the Materials Project, an international research group founded at the Lawrence Berkeley National Laboratory in 2011, which comprises existing research on around 50,000 already-known materials.
The company said it would now share its data with the research community, in the hopes of accelerating further breakthroughs in material discovery.
"Industry tends to be a little risk-averse when it comes to cost increases, and new materials typically take a bit of time before they become cost-effective," said Kristin Persson, director of the Materials Project.
"If we can shrink that even a bit more, it would be considered a real breakthrough."
Having used AI to predict the stability of these new materials, DeepMind said it would now turn its focus to predicting how easily they can be synthesised in the lab.