Journal:An open experimental database for exploring inorganic materials

Full article title: An open experimental database for exploring inorganic materials
Journal: Scientific Data
Author(s): Zakutayev, Andriy; Wunder, Nick; Schwarting, Marcus; Perkins, John D.; White, Robert; Munch, Kristin; Tumas, William; Phillips, Caleb
Author affiliation(s): National Renewable Energy Laboratory
Primary contact: See original article for email address
Year published: 2018
Volume and issue: 5
Page(s): 180053
DOI: 10.1038/sdata.2018.53
ISSN: 2052-4463
Distribution license: Creative Commons Attribution 4.0 International
Website: https://www.nature.com/articles/sdata201853
Download: https://www.nature.com/articles/sdata201853.pdf (PDF)

Abstract

The use of advanced machine learning algorithms in experimental materials science is limited by the lack of sufficiently large and diverse datasets amenable to data mining. If publicly open, such data resources would also enable materials research by scientists without access to expensive experimental equipment. Here, we report on our progress towards a publicly open High Throughput Experimental Materials (HTEM) Database (htem.nrel.gov). This database currently contains 140,000 sample entries, characterized by structural (100,000), synthetic (80,000), chemical (70,000), and optoelectronic (50,000) properties of inorganic thin film materials, grouped in >4,000 sample libraries across >100 materials systems; more than half of these data are publicly available. This article shows how the HTEM database may enable scientists to explore materials by browsing a web-based user interface and through an application programming interface. This paper also describes the high-throughput experimental (HTE) approach to generating materials data and discusses the laboratory information management system (LIMS) that underpins the HTEM database. Finally, this manuscript illustrates how advanced machine learning algorithms can be adapted to materials science problems using this open data resource.
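
As a rough illustration of the programmatic access mentioned above, the minimal Python sketch below shows how a REST-style interface such as the one described for the HTEM database might be queried with the requests library. The endpoint path, query parameters, and returned fields are assumptions made for illustration only; consult the database's own API documentation for the actual routes.

  # Hedged sketch of querying an HTEM-style REST interface. The base URL comes
  # from the article (htem.nrel.gov); the endpoint name and parameters below are
  # hypothetical placeholders, not documented HTEM API routes.
  import requests

  API_BASE = "https://htem.nrel.gov"       # database reported in the article
  SAMPLE_ENDPOINT = "/api/sample_library"  # hypothetical endpoint name

  def fetch_sample_libraries(element="Zn", limit=10):
      """Request sample libraries containing a given element (illustrative only)."""
      response = requests.get(
          API_BASE + SAMPLE_ENDPOINT,
          params={"element": element, "limit": limit},  # hypothetical query parameters
          timeout=30,
      )
      response.raise_for_status()
      return response.json()

  if __name__ == "__main__":
      for library in fetch_sample_libraries():
          print(library)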

Keywords: applied physics, electronic devices, materials chemistry, semiconductors, solar cells

Introduction

Machine learning is a branch of computer science concerned with algorithms that can develop models from the available data, reveal trends and correlations in these data, and make predictions about unavailable data. The predictions rely on data mining, the process of discovering patterns in large data sets using statistical methods. Machine learning methods have recently been successful in process automation, natural language processing, and computer vision, where large databases are available to support data-driven modeling efforts. These successes have also sparked discussions about the potential of artificial intelligence in science[1] and the Fourth Paradigm[2] of data-driven scientific discovery. In materials science, applying artificial intelligence to data-driven materials discovery is important because new materials often underpin major advances in modern technologies. For example, in advanced energy technologies, efficient solid-state lighting was enabled by the use of gallium nitride in light-emitting diodes, electric cars were brought to life by intercalation materials used in lithium-ion batteries, and modern computers would not have been possible without the material silicon.
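
To make the modeling loop described above concrete, the short Python sketch below fits a model to "available" (labeled) data and then predicts the target property for held-out samples standing in for the unavailable data. The descriptors and property values are synthetic placeholders generated for illustration, not measurements from any materials database.

  # Minimal sketch of data-driven modeling: train on available data, predict on
  # held-out data. All numbers here are synthetic placeholders.
  import numpy as np
  from sklearn.ensemble import RandomForestRegressor
  from sklearn.model_selection import train_test_split

  rng = np.random.default_rng(0)
  X = rng.random((200, 3))                                      # stand-in descriptors (e.g., composition fractions)
  y = 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.05, 200)  # stand-in target property

  # "Available" data are used to fit the model; the held-out split stands in for unavailable cases.
  X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
  model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
  print("Held-out R^2:", round(model.score(X_test, y_test), 3))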

In computational materials science[3], machine learning methods have recently been used to predict the structure[4], stability[5], and properties[6] of inorganic solid-state materials. These results have been enabled by advances in simulation tools at multiple length scales.[7] The resulting simulated materials data are stored in ever-growing, publicly accessible computational property databases.[8][9][10] In contrast to computations, experimental materials discovery using machine learning is limited by the dearth of large and diverse datasets (Fig. 1). Large experimental datasets like the Inorganic Crystal Structure Database (ICSD)[11] contain hundreds of thousands of entries but are not diverse enough, as they contain only the composition and structure of the materials. Diverse datasets like Landolt–Börnstein (http://materials.springer.com/)[12] or AtomWork (http://crystdb.nims.go.jp/index_en.html)[13] contain hundreds to thousands of entries for different properties, so they are not large enough for training modern machine learning algorithms. Furthermore, none of these datasets contains synthesis information, such as temperature or pressure, which is critical to making materials with target properties. Thus, machine learning for experimental materials research has so far focused on the adoption of existing algorithms suitable for relatively small but complex datasets, such as collections of x-ray diffraction patterns[14], microscopy images[15], or materials microstructure.[16]
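
As a rough illustration of that last point, the Python sketch below applies an off-the-shelf clustering algorithm to a small collection of synthetic, diffraction-like patterns. The peak positions, noise level, and number of "phases" are arbitrary choices for illustration; real studies work with measured patterns and considerably more careful preprocessing.

  # Hedged sketch: unsupervised clustering of synthetic diffraction-like patterns.
  import numpy as np
  from sklearn.cluster import AgglomerativeClustering

  def gaussian_peak(two_theta, center, width=0.3):
      """A single Gaussian peak, a crude stand-in for a diffraction reflection."""
      return np.exp(-0.5 * ((two_theta - center) / width) ** 2)

  two_theta = np.linspace(20, 80, 600)  # 2-theta axis in degrees (illustrative range)
  rng = np.random.default_rng(1)

  # Two hypothetical "phases" with different peak positions, plus measurement noise.
  phase_a, phase_b = [28.0, 47.5, 56.3], [31.2, 44.8, 65.0]
  patterns = np.array([
      sum(gaussian_peak(two_theta, p) for p in peaks) + rng.normal(0, 0.02, two_theta.size)
      for peaks in 30 * [phase_a] + 30 * [phase_b]
  ])

  # Group the 60 patterns into two clusters; ideally each cluster recovers one "phase."
  labels = AgglomerativeClustering(n_clusters=2).fit_predict(patterns)
  print("Cluster assignments:", labels)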


References

  1. "The AI revolution in science". Science. 7 July 2017. doi:10.1126/science.aan7064. http://www.sciencemag.org/news/2017/07/ai-revolution-science. 
  2. Hey, T.; Tansley, S.; Tolle, K. (2009). The Fourth Paradigm: Data-Intensive Scientific Discovery. Microsoft Research. ISBN 9780982544204. https://www.microsoft.com/en-us/research/publication/fourth-paradigm-data-intensive-scientific-discovery/. 
  3. Nosengo, N. (2016). "The Material Code: Machine-learning techniques could revolutionize how materials science is done". Nature 533: 22–25. doi:10.1038/533022a. 
  4. Meredig, B.; Agrawal, A.; Kirklin, S. et al. (2014). "Combinatorial screening for new materials in unconstrained composition space with machine learning". Physical Review B 89: 094104. doi:10.1103/PhysRevB.89.094104. 
  5. Hautier, G.; Fischer, C.C.; Jain, A. et al. (2010). "Finding Nature’s Missing Ternary Oxide Compounds Using Machine Learning and Density Functional Theory". Chemistry of Materials 22 (12): 3762–3767. doi:10.1021/cm100795d. 
  6. Carrete, J.; Li, W.; Mingo, N. et al. (2014). "Finding Unprecedentedly Low-Thermal-Conductivity Half-Heusler Semiconductors via High-Throughput Materials Modeling". Physical Review X 4: 011019. doi:10.1103/PhysRevX.4.011019. 
  7. Rajan, K., ed. (2013). Informatics for Materials Science and Engineering (1st ed.). Butterworth-Heinemann. pp. 542. ISBN 9780123946140. 
  8. Jain, A.; Ong, S.P.; Hautier, G. et al. (2013). "Commentary: The Materials Project: A materials genome approach to accelerating materials innovation". APL Materials 1 (1): 011002. doi:10.1063/1.4812323. 
  9. Curtarolo, S.; Setyawan, W.; Wang, S. et al. (2012). "AFLOWLIB.ORG: A distributed materials properties repository from high-throughput ab initio calculations". Computational Materials Science 58: 227–235. doi:10.1016/j.commatsci.2012.02.002. 
  10. Saal, J.E.; Kirklin, S.; Aykol, M. et al. (2013). "Materials Design and Discovery with High-Throughput Density Functional Theory: The Open Quantum Materials Database (OQMD)". JOM 65 (11): 1501–1509. doi:10.1007/s11837-013-0755-4. 
  11. Belsky, A.; Hellenbrandt, M.; Karen, V.L.; Luksch, P. (2002). "New developments in the Inorganic Crystal Structure Database (ICSD): Accessibility in support of materials research and design". Acta Crystallographica B 58 (Pt 3 Pt 1): 364–369. PMID 12037357. 
  12. Hellwege, K.H. (1967). "Landolt-Börnstein, Numerical Data and Functional Relationships in Science and Technology". American Journal of Physics 35 (3): 291–292. doi:10.1119/1.1974060. 
  13. Xu, Y.; Yamazaki, M.; Villars, P. (2011). "Inorganic Materials Database for Exploring the Nature of Material". Japanese Journal of Applied Physics 50 (11S): 11RH02. doi:10.1143/JJAP.50.11RH02. 
  14. Mueller, T.; Kusne, A.G.; Ramprasad, R. (2016). "Chapter 4: Machine Learning in Materials Science". In Parrill, A.L.; Lipkowitz, K.B. Reviews in Computational Chemistry. John Wiley & Sons, Inc. doi:10.1002/9781119148739.ch4. 
  15. Kalinin, S.V.; Sumpter, B.G.; Archibald, R.K. (2015). "Big–deep–smart data in imaging for guiding materials design". Nature Materials 14: 973–980. doi:10.1038/nmat4395. 
  16. Kalidindi, S.R.; De Graef, M. (2015). "Materials Data Science: Current Status and Future Outlook". Annual Review of Materials Research 45: 171–193. doi:10.1146/annurev-matsci-070214-020844. 

Notes

This presentation is faithful to the original, with only a few minor changes to presentation. In some cases important information was missing from the references, and that information was added.