Post by bracknellboy on Jan 10, 2024 18:57:09 GMT
On the other hand,
"...used AI and supercomputers to narrow down 32 million potential inorganic materials to 18 promising candidates in less than a week - a screening process that could have taken more than two decades to carry out using traditional lab research methods.
So saying it "just" sped things up is a bit like saying the steam engine "just" sped things up, or "just" reduced the manpower required. Which in most circumstances amount to the same thing. I'd suggest the 'gain' is a bit more than just a bit of a speed up.
What is not said in there is how these 'potential' materials were known/discovered. The article doesn't go into any details. But if I was to hazard a wild guess, I'm thinking it is fairly likely that the AI contribution here was something like:
- train the model (or get it to teach itself) to understand what can make a good battery material
- train the model (or get it to teach itself) about molecular structures/shapes/chemical behaviours to understand what would happen if you doped this with that/combined the other with this
- get it to postulate and synthesise new materials (model wise) based on this knowledge
- and a shed load more that I'm even less qualified to postulate and prattle on about
- analyse the resulting materials to determine whether, probabilistically, they were good/viable candidates or not
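The funnel those steps describe can be sketched in a few lines. To be clear, this is purely illustrative and nothing like what the article describes under the hood: the real system would use a trained property-prediction model over actual material structures, whereas here `surrogate_score` is a made-up stand-in that just returns a reproducible pseudo-random "viability" score, and the candidate names are invented. The point is only the shape of the process: generate a huge pool, score it cheaply, and keep a tiny shortlist for real-world testing.

```python
import random

def surrogate_score(candidate: str) -> float:
    # Hypothetical stand-in for a trained ML property predictor.
    # Seeding with the candidate name gives a deterministic score in [0, 1)
    # (Python hashes str seeds with SHA-512, so this is stable across runs).
    return random.Random(candidate).random()

def screen(candidates, threshold=0.999):
    """Keep only candidates whose predicted viability clears the threshold."""
    return [c for c in candidates if surrogate_score(c) >= threshold]

# Generate a large pool of (entirely made-up) candidate identifiers,
# then screen it down to a shortlist - the "32 million to 18" step in miniature.
pool = [f"material-{i}" for i in range(100_000)]
shortlist = screen(pool)
print(f"{len(pool)} candidates -> {len(shortlist)} shortlisted")
```

The expensive part in reality is the scoring model, not the loop; the gain the article describes comes from that score being vastly cheaper than synthesising and testing each material in a lab.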
In other words, it created novel materials based on knowledge gained, and "tested" whether they were likely good candidates or not, to get a shortlist to be presented to real-world actors.
So sure, it hasn't lit any physical Bunsen burners. But if the 'gain' is a couple of decades compressed into (less than) a week, that is really not much different from saying it has done something which may have otherwise never been done. And this might be a dud of course. But hell, there have been plenty of those over the years. Fail fast, just a shed load faster, learn and improve.