Frontiers of Physics

ISSN 2095-0462

ISSN 2095-0470(Online)

CN 11-5994/O4


Front. Phys.    2024, Vol. 19 Issue (1) : 13501    https://doi.org/10.1007/s11467-023-1325-z
REVIEW ARTICLE
Advances of machine learning in materials science: Ideas and techniques
Sue Sin Chong1, Yi Sheng Ng1, Hui-Qiong Wang1,2(), Jin-Cheng Zheng1,2()
1. Department of New Energy Science and Engineering, Xiamen University Malaysia, Sepang 43900, Malaysia
2. Engineering Research Center of Micro-nano Optoelectronic Materials and Devices, Ministry of Education; Fujian Key Laboratory of Semiconductor Materials and Applications, CI Center for OSED, and Department of Physics, Xiamen University, Xiamen 361005, China
Abstract

In this big data era, the use of large datasets in conjunction with machine learning (ML) has become increasingly popular in both industry and academia. The field of materials science is also undergoing a big data revolution, with large databases and repositories appearing everywhere. Traditionally, materials science has been a trial-and-error field, in both its computational and experimental branches. With the advent of machine-learning-based techniques, there has been a paradigm shift: materials can now be screened quickly using ML models and even generated from materials with similar properties; ML has also quietly infiltrated many sub-disciplines of materials science. However, ML remains relatively new to the field and is expanding quickly. There is a plethora of readily available big data architectures and an abundance of ML models and software; integrating all these elements into a comprehensive research procedure is becoming an important direction of materials science research. In this review, we attempt to provide an introduction to and reference on ML for materials scientists, covering the commonly used methods and applications as comprehensively as possible, and discussing future possibilities.

Keywords machine learning      materials science     
Corresponding Author(s): Hui-Qiong Wang, Jin-Cheng Zheng
Online First Date: 14 September 2023    Issue Date: 14 September 2023
 Cite this article:   
Sue Sin Chong, Yi Sheng Ng, Hui-Qiong Wang, et al. Advances of machine learning in materials science: Ideas and techniques[J]. Front. Phys., 2024, 19(1): 13501.
 URL:  
https://academic.hep.com.cn/fop/EN/10.1007/s11467-023-1325-z
https://academic.hep.com.cn/fop/EN/Y2024/V19/I1/13501
Fig.1  List of the conventional machine learning tasks and the problems tackled [68].
Fig.2  List of typical machine learning terminologies [68].
Fig.3  Illustration of the typical stages of a learning process [68].
Ideas & techniques Relevant developments and models
Pre-training Ref. [71], Ref. [74], BLIP [75], Pretrained transformers [76], Ref. [77]
Fine-tuning Ref. [78], Ref. [79]
Bidirectional encoder BERT [80], Albert [81], Robustly optimized BERT pre-training approach (RoBERTa) [82], CodeBERT [72], BeiT [73]
Transformer Ref. [69], Ref. [83], Ref. [84], Transformer memory as search index [85]
Attention prompt Ref. [86], AutoPrompt [87], OpenPrompt [88]
Learning extra huge models Open pre-trained transformer (OPT-175B) [89], Jurassic-1 [90], Generative pre-trained transformer 3 (GPT-3) [91], CLD-3 [92]
End-to-end model Word2Vec [93], Global vectors for word representation (GloVe) [94], Context2Vec [95], Structure2Vec [96], Driver2Vec [97], wav2vec [98]
Tab.1  Natural language processing (NLP) ideas, techniques and models.
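The end-to-end embedding models in the last row of Tab. 1 (Word2Vec, GloVe, and their descendants) all rest on the same idea: represent each token as a vector so that tokens appearing in similar contexts end up close together. A minimal sketch of that idea using raw co-occurrence counts (an illustrative toy, not an actual trained Word2Vec model; the corpus is invented):

```python
# Toy word-vector illustration: represent each word by its co-occurrence
# pattern, then compare words with cosine similarity.  Real end-to-end
# models learn dense vectors with neural networks instead.
import numpy as np

corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "stocks fell on the news",
]
vocab = sorted({w for line in corpus for w in line.split()})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a +/-2 word window.
C = np.zeros((len(vocab), len(vocab)))
for line in corpus:
    words = line.split()
    for i, w in enumerate(words):
        for j in range(max(0, i - 2), min(len(words), i + 3)):
            if j != i:
                C[idx[w], idx[words[j]]] += 1

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# "cat" and "dog" share contexts ("sat on"), so their vectors align more
# closely than "cat" and "stocks".
print(cosine(C[idx["cat"]], C[idx["dog"]]))
print(cosine(C[idx["cat"]], C[idx["stocks"]]))
```

Word2Vec and GloVe replace the raw counts with learned low-dimensional vectors, but the similarity structure they capture is of this kind.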
Ideas & techniques Relevant literature
Visual models Visual transformer [101], Flamingo (LM) [102]
Image-text processing CoCa [103], FuseDream [104], CLIP [105]
Convolutional neural network DEtection transformer (DETR) [106], LiT [107], Ref. [108]
Image rendering Dall-E [109], Review [110], Neural radiance field (NeRF) [111]
Point cloud reconstruction PointNorm [112], Ref. [113], Residual MLP [114], Learning on point cloud [115]
Tab.2  Computer vision (CV) ideas, techniques and references.
Ideas & techniques Relevant literature
Types of RL Q-learning [119], SARSA [118], Temporal difference (TD)-learning [123]
RL algorithm Self-training [124], Deep Q-learning (DQN) [125], Deep deterministic policy gradient (DDPG) [126], Offline [127]
Apprenticeship learning, efficient RL SayCan [128], Q-attention [129], Imitation learning [130], Replay with compression [131], Decision transformer [132]
Evolving curriculum Adversarially compounding complexity by editing levels (ACCEL) [133], Paired open-ended trailblazer (POET) [134], Autonomous driving scene render (READ) [135]
Bandit problem Bandit learning [136], Batched bandit [137], Dueling bandit [138], Upper confidence bound (UCB) [139]
Tab.3  Reinforcement learning (RL) ideas, techniques and references.
Fig.4  Feature engineering for ML applications. (a) Feature extraction process. Starting from the material space, one can extract information from the material space into chemical structures and then into the descriptor space. (b) Typical ML feature analysis methods. “FEWD” refers to the Filter method, Embedded method, Wrapper method, and Deep learning. (c) Correlation and importance analysis of selected features. The feature correlations are visualized in the diagram on the left. The diagram on the right is a normalized version of the left one, where the colors indicate the relative correlation of every other feature for prediction of the row/column feature. (d) Various feature subsets obtained from feature engineering analysis. One can construct features from linearly independent combinations of subsets; in other words, the feature subsets form a basis. Reproduced with permission from Ref. [172].
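The filter method in the "FEWD" taxonomy of panel (b) can be illustrated with a correlation screen like panel (c): compute pairwise feature correlations and flag near-redundant pairs so one of each pair can be dropped. A sketch with made-up descriptors (the feature names, data, and threshold are assumptions, not from Ref. [172]):

```python
# Filter-method feature selection sketch: flag feature pairs whose
# absolute Pearson correlation exceeds a threshold.
import numpy as np

rng = np.random.default_rng(0)
n = 200
density = rng.normal(5.0, 1.0, n)                         # hypothetical descriptor 1
molar_volume = 10.0 / density + rng.normal(0, 0.001, n)   # nearly redundant with density
band_gap = rng.normal(2.0, 0.5, n)                        # independent descriptor

X = np.column_stack([density, molar_volume, band_gap])
corr = np.abs(np.corrcoef(X, rowvar=False))

# Flag off-diagonal pairs with |r| above a chosen threshold.
threshold = 0.9
redundant = [(i, j) for i in range(corr.shape[0])
             for j in range(i + 1, corr.shape[1]) if corr[i, j] > threshold]
print(redundant)
```

Here the density/molar-volume pair is flagged while the independent band gap survives; wrapper and embedded methods instead score feature subsets through the model itself.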
Fig.5  Infographic of the end-to-end model. End-to-end models take multi-modal datasets as inputs and encode them into vectors for the surrogate model. The surrogate model then learns the latent representation, which makes the internal patterns of these datasets indexable. One can then decode the latent representation into an output form of choice, including property predictions, generated novel materials, and co-pilot simulation engines.
Fig.6  Schematic of the representation learning methods used in the structural characterization of catalysts, where the autoencoder, which includes the encoder and decoder, is used, with the input and output data being the same. Reproduced with permission from Ref. [176].
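The autoencoder of Fig. 6 trains an encoder and decoder jointly so that the output reproduces the input through a lower-dimensional latent code. A minimal linear sketch in NumPy (an illustrative stand-in, not the network of Ref. [176]):

```python
# Minimal linear autoencoder: compress 3-D inputs to a 2-D latent code
# and reconstruct them, trained with plain gradient descent on the
# mean-squared reconstruction error.
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data that truly lies on a 2-D plane inside 3-D space.
Z_true = rng.normal(size=(100, 2))
X = Z_true @ rng.normal(size=(2, 3))

W_enc = rng.normal(scale=0.1, size=(3, 2))  # encoder weights
W_dec = rng.normal(scale=0.1, size=(2, 3))  # decoder weights
lr = 0.01

def reconstruction_loss():
    return float(np.mean((X @ W_enc @ W_dec - X) ** 2))

initial = reconstruction_loss()
for _ in range(2000):
    Z = X @ W_enc                    # encode to the 2-D latent code
    X_hat = Z @ W_dec                # decode back to 3-D
    G = 2.0 * (X_hat - X) / len(X)   # gradient of the loss w.r.t. X_hat
    grad_dec = Z.T @ G
    grad_enc = X.T @ (G @ W_dec.T)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

print(initial, reconstruction_loss())
```

Real autoencoders add nonlinear activations and deeper stacks, but the encode-decode-reconstruct loop is exactly this.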
Fig.7  Depending on the degrees of freedom (DOF) involved, the machine learning methodologies for photonic design vary. The analytical methods suitable for a DOF of order unity are replaced by the discriminative models of ML. As the DOF increases, generative models are leveraged to bring down the dimensionality. Reproduced with permission from Ref. [178].
Data type Database
Computational data OQMD: Materials properties calculated from DFT [179,180], Materials project [181], Joint automated repository for various integrated simulations (JARVIS) [182], AFLOW [183], MatCloud [184], MPDS [185], NOMAD [186], C2DB [187], 2DMatPedia [188]
Crystallographic data ICSD [189], Crystallography open database (COD) [190], The NIST surface structure database (SSD4) [41], Aspherical electron scattering factors [191], AlphaFold [192]
Imaging/spectra data MatBench [193], TEMImageNet [194], Single-atom library [195]
Other types Knowledge graph, e.g., propnet [196]
Tab.4  Typical material science databases.
Library Library Description
General deep learning libraries (APIs) DeepMind JAX [201] Open ML codebase by DeepMind. With its updated version of Autograd, JAX can automatically differentiate native Python and NumPy code.
Keras [202] Free open source Python library for developing and evaluating deep learning models.
PyTorch [203] PyTorch is an open source machine learning framework based on the Torch library.
TensorFlow [204] Created by the Google Brain team, TensorFlow is an open source library for numerical computation and large-scale machine learning.
Useful libraries for machine learning tasks HuggingFace [205] Open NLP library with trained models, APIs, and dataset loaders.
OpenRefine [206] OpenRefine is an open-source desktop application for data cleanup and transformation to other formats, an activity commonly known as data wrangling.
PyTorch Geometric [207] PyG (PyTorch Geometric) is a library built upon PyTorch to easily write and train Graph Neural Networks (GNNs) for a wide range of applications related to structured data.
PyTorch Lightning [208] PyTorch Lightning is a deep learning framework for professional AI researchers and machine learning engineers who need maximal flexibility without sacrificing performance at scale.
VectorFlow [209] Optimized for sparse data in single machine environment.
Weights & Biases [210] W&B for experiment tracking, dataset versioning, and collaborating on ML projects.
Tools that might be useful to material science Dscribe [211] Provides popular feature transformations (“descriptors”) for atomistic materials simulations, including the Coulomb matrix, Ewald sum matrix, sine matrix, many-body tensor representation (MBTR), atom-centered symmetry functions (ACSF), and smooth overlap of atomic positions (SOAP).
Open graph benchmark [212] The open graph benchmark (OGB) is a collection of realistic, large-scale, and diverse benchmark datasets for machine learning on graphs. OGB datasets are automatically downloaded, processed, and split using the OGB Data Loader.
RDKit [213] Open-source library for converting molecules to SMILES strings.
Spektral [214] Spektral is a Python library for graph deep learning, based on the Keras API and TensorFlow 2.
Tab.5  Machine learning libraries. All descriptions were adapted from the references therein.
Fig.8  (a) The mathematical description of the Weyl and Coulomb matrices. (b) The construction of the PRDF sums, where atoms covered by the yellow strip spanning the radii (r, r+dr) are considered. (b) Reproduced with permission from Ref. [221].
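The Coulomb matrix of Fig. 8(a), in its standard form from the descriptor literature, has diagonal entries 0.5 Z_i^2.4 and off-diagonal entries Z_i Z_j / |R_i − R_j|. A direct NumPy sketch (the diatomic example below is invented for illustration):

```python
# Coulomb-matrix descriptor:
#   M_ii = 0.5 * Z_i**2.4,   M_ij = Z_i * Z_j / |R_i - R_j|   (atomic units).
import numpy as np

def coulomb_matrix(Z, R):
    """Z: (n,) nuclear charges; R: (n, 3) Cartesian positions (bohr)."""
    Z = np.asarray(Z, dtype=float)
    R = np.asarray(R, dtype=float)
    # Pairwise interatomic distances.
    D = np.linalg.norm(R[:, None, :] - R[None, :, :], axis=-1)
    with np.errstate(divide="ignore"):
        M = np.outer(Z, Z) / D        # diagonal becomes inf here...
    np.fill_diagonal(M, 0.5 * Z ** 2.4)  # ...and is overwritten by the Z-term
    return M

# Example: a fictitious diatomic with Z = (1, 8) separated by 2 bohr.
M = coulomb_matrix([1, 8], [[0, 0, 0], [0, 0, 2]])
print(M)
```

The matrix is symmetric and invariant under translation and rotation of the molecule, which is what makes it usable as an ML input.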
Fig.9  (a) Structure graph for 2,3,4-trimethylhexane and (b) the related adjacency and distance matrix. Reproduced with permission from Ref. [231]. (c) The Universal fragment descriptors. The crystal structure is analysed for atomic neighbours via Voronoi tessellation with the infinite periodicity taken into account. Reproduced with permission from Ref. [232].
Fig.10  Crystal graph construction used in the generalized crystal graph convolutional neural networks. Reproduced with permission from Ref. [233].
Fig.11  (a) Left to right: 0-, 1-, 2-, 3-simplex. (b) An example of a simplicial complex, with five vertices: a, b, c, d, and e, six 1-simplices: A, B, C, D, E, and F, and one 2-simplex T. The Betti numbers for this complex are β0=β1=1. Reproduced with permission from Ref. [238].
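For a complex like the one in panel (b), the Betti numbers can be recovered from the Euler characteristic χ = V − E + F = β0 − β1 + β2. A sketch assuming β2 = 0 (no enclosed voids) and an assumed labeling of the edges and the triangle T, since the caption does not list them:

```python
# Betti numbers of a small simplicial complex via the Euler characteristic:
# 5 vertices, six 1-simplices, one filled 2-simplex (edge/triangle labels
# are assumed for illustration).
vertices = {"a", "b", "c", "d", "e"}
edges = {("a", "b"), ("b", "c"), ("c", "d"), ("d", "e"), ("e", "a"), ("a", "c")}
triangles = {("a", "b", "c")}  # the 2-simplex T (assumed vertex set)

# beta0 = number of connected components, found via union-find on the edges.
parent = {v: v for v in vertices}
def find(v):
    while parent[v] != v:
        parent[v] = parent[parent[v]]  # path halving
        v = parent[v]
    return v
for u, w in edges:
    parent[find(u)] = find(w)
beta0 = len({find(v) for v in vertices})

# With beta2 = 0:  chi = V - E + F = beta0 - beta1.
chi = len(vertices) - len(edges) + len(triangles)
beta1 = beta0 - chi
print(beta0, beta1)
```

The result β0 = β1 = 1 matches the caption: one connected component and one unfilled loop (the pentagon's second cycle is killed by the filled triangle).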
Fig.12  Persistence barcode plot for the selected Na atom inside a NaCl crystal, surrounded by only (a) Na atoms and (b) Cl atoms. (c) Construction of crystal topological descriptor, taking into account different chemical environments. Reproduced with permission from Ref. [236].
Fig.13  (a) Experimental XRD method, where an X-ray plane wave is incident on a crystal, resulting in diffraction fingerprints. (b, c) XRD-based image descriptor for a crystal, where each RGB colour corresponds to rotation about the x, y, z axes. The robustness of the descriptor against defects can be observed by comparing (b) to (c). (d) Examples of 1D XRD patterns. (a−c) Reproduced with permission from Ref. [239], (d) Reproduced with permission from Ref. [241].
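The 1D XRD patterns of panel (d) are sets of peaks at angles fixed by Bragg's law, nλ = 2d sin θ. A sketch with assumed example values (the Cu Kα wavelength and a rock-salt lattice constant are illustrative choices, not taken from the figure):

```python
# Bragg's law:  n * lambda = 2 * d * sin(theta)
# -> peak position 2*theta for a given lattice-plane spacing d.
import numpy as np

def two_theta_deg(d, wavelength=1.5406, n=1):
    """Diffraction angle 2*theta (degrees) for plane spacing d (angstrom)."""
    s = n * wavelength / (2.0 * d)
    if s > 1.0:
        return None  # Bragg condition cannot be met: no diffraction
    return 2.0 * np.degrees(np.arcsin(s))

# NaCl (200) planes: d = a/2 with a ~ 5.64 angstrom -> 2*theta ~ 31.7 deg.
print(two_theta_deg(5.64 / 2))
```

A full 1D pattern is just this calculation repeated over all allowed (hkl) planes, with intensities from the structure factor.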
ML algorithms Tool
Support vector machine (SVM) Refs. [260, 261, 262, 246]
Kernel ridge regression (KRR) Refs. [237, 263, 247, 264]
Deep neural network VampNet [257], DTNN [265], ElemNet [266], IrNet [267], PhysNet [268], DeepMolNet [269], SIPFENN [270], SpookyNet [250]
Convolutional neural network (CNN) SchNet [271], Refs. [239, 240, 272, 273]
Graph neural network (GNN) CGCNN [274], MEGNet [275], GATGNN [276], OrbNet [277], DimeNET [278], ALIGNN [279], MXMNet [280], GraphVAMPNet [281], GdyNets [282], NequIP [283], PaiNN [284], CCCGN [285, 286], FFiNet [287]
Generative adversarial networks (GAN) Ref. [288], CrystalGAN [246], MatGAN [289]
Variational autoencoder (VAE) FTCP [290], CDVAE [291], Refs. [292, 263]
Random forest/ decision tree Refs. [236, 293, 294, 251, 295, 296]
Unsupervised clustering Refs. [241, 282, 252, 297, 298]
Transfer learning Roost [299], AtomSets [288], XenonPy.MDL [289], TDL [290], Refs. [256, 291, 292, 300, 301]
Tab.6  List of machine learning (ML) algorithms used by various tools or frameworks developed in materials science.
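As one concrete example from Tab. 6, kernel ridge regression (KRR) fits coefficients α = (K + λI)⁻¹y for a kernel matrix K and predicts with k(x)·α. A toy NumPy sketch (not any of the cited tools; the kernel width and regularization strength are arbitrary illustrative choices):

```python
# Kernel ridge regression with a Gaussian (RBF) kernel on a toy 1-D problem.
import numpy as np

def rbf_kernel(A, B, gamma=10.0):
    # K[i, j] = exp(-gamma * (A_i - B_j)**2) for 1-D sample arrays A, B.
    return np.exp(-gamma * (A[:, None] - B[None, :]) ** 2)

rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 1.0, 30)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0.0, 0.05, 30)

lam = 1e-3  # ridge regularization strength
K = rbf_kernel(x_train, x_train)
# Closed-form fit: alpha = (K + lam*I)^-1 y.
alpha = np.linalg.solve(K + lam * np.eye(len(x_train)), y_train)

def predict(x):
    return rbf_kernel(np.atleast_1d(x), x_train) @ alpha

print(float(predict(0.25)[0]))  # close to sin(pi/2) = 1
```

In materials applications, `x` would be a descriptor vector (e.g., a flattened Coulomb matrix) and the kernel a distance in descriptor space; the closed-form solve is what makes KRR attractive for small datasets.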
Fig.14  (a) Neural network (NN) with 3 layers: input, hidden, and output. (b) Deep NN with 3 hidden layers. Reproduced with permission from Ref. [257].
Fig.15  The CNN architecture used in the work of Ziletti et al. [239]. (a) A kernel, or learnable filter, is applied across the image, taking the scalar product between the filter and the image data at every point and resulting in an activation map. This process is repeated in (b) and then coarse-grained in (c), reducing the dimension. The map is then passed to regular NN hidden layers (d) before being used to classify the crystal structure (e). Reproduced with permission from Ref. [239].
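The filtering step in panel (a) is a cross-correlation: slide the kernel over the image and take the elementwise product-sum at each position. A sketch with an invented edge-detection example (not the filters learned in Ref. [239]):

```python
# "Valid" 2-D convolution (cross-correlation, as used in CNN practice):
# slide a kernel over an image and record the product-sum at each position,
# producing an activation map.
import numpy as np

def conv2d_valid(image, kernel):
    H, W = image.shape
    k, l = kernel.shape
    out = np.zeros((H - k + 1, W - l + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + k, j:j + l] * kernel)
    return out

# A vertical-edge image and a 2x2 horizontal-gradient kernel: the
# activation map lights up exactly where the edge sits.
image = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1]], dtype=float)
kernel = np.array([[-1, 1],
                   [-1, 1]], dtype=float)
activation = conv2d_valid(image, kernel)
print(activation)
```

A real CNN layer applies many such filters in parallel, learns their weights by backpropagation, and follows them with the pooling (coarse-graining) step of panel (c).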
Fig.16  (a) An example of a decision tree, where each square represents an internal node or feature, each arrow represents a branch or decision rule, and the green circles are leaves representing class labels or numerical values. (b) Dendrogram obtained via agglomerative hierarchical clustering (AHC), where the dashed line indicates the optimal clustering. (a) Reproduced with permission from Ref. [258], (b) Reproduced with permission from Ref. [241].
Fig.17  The architectures of the two generative models: generative adversarial networks (GAN) and variational autoencoders (VAE). Reproduced with permission from Ref. [259].
Fig.18  (a) Composition-conditioned crystal GAN, designed to generate crystals that can be applied in photodiodes. (b) Simplified VAE architecture used in the inverse design of VxOy materials. (a) Reproduced with permission from Ref. [253], (b) Reproduced with permission from Ref. [254].
Fig.19  (a) Overview of explainable DNNs approaches. (b) Feature visualization in the form of heat map used in determining the ionic conductivity from SEM images. (a) Reproduced with permission from Ref. [302], (b) Reproduced with permission from Ref. [305].
Fig.20  (a) The architecture of CrysXPP, which is capable of producing explainable results, as seen in (b) the bar chart of features affecting the band gap of GaP crystal. Reproduced with permission from Ref. [306].
Fig.21  High-throughput screening with learnt interatomic potential embedding from Ref. [330]. With the integration of active learning and DFT into the screening pipeline, the throughput efficiency or the quality of the output obtained from the calculations can be improved. Reproduced with permission from Ref. [330].
Fig.22  Schematics of generative adversarial network. Reproduced with permission from Ref. [332].
1 Weinan E.. Machine learning and computational mathematics. Commun. Comput. Phys., 2020, 28: 1639
https://doi.org/10.4208/cicp.OA-2020-0185
2 Agrawal A., Choudhary A.. Perspective: Materials informatics and big data: Realization of the “fourth paradigm” of science in materials science. APL Mater., 2016, 4(5): 053208
https://doi.org/10.1063/1.4946894
3 Xu Y., Liu X., Cao X., Huang C., Liu E.. et al.. Artificial intelligence: A powerful paradigm for scientific research. The Innovation, 2021, 2(4): 100179
https://doi.org/10.1016/j.xinn.2021.100179
4 Carleo G., Cirac I., Cranmer K., Daudet L., Schuld M., Tishby N., Vogt-Maranto L., Zdeborová L.. Machine learning and the physical sciences. Rev. Mod. Phys., 2019, 91: 045002
https://doi.org/10.1103/RevModPhys.91.045002
5 R. Schleder G., C. M. Padilha A., M. Acosta C., Costa M., Fazzio A.. From DFT to machine learning: Recent approaches to materials science – A review. J. Phys.: Mater., 2019, 2(3): 032001
https://doi.org/10.1088/2515-7639/ab084b
6 Potyrailo R., Rajan K., Stoewe K., Takeuchi I., Chisholm B., Lam H.. Combinatorial and high-throughput screening of materials libraries: Review of state of the art. ACS Combin. Sci., 2011, 13(6): 579
https://doi.org/10.1021/co200007w
7 Alberi K., B. Nardelli M., Zakutayev A., Mitas L., Curtarolo S.. et al.. The 2019 materials by design roadmap. J. Phys. D, 2018, 52(1): 013001
https://doi.org/10.1088/1361-6463/aad926
8 Torquato S.. Optimal design of heterogeneous materials. Ann. Rev. Mater. Res., 2010, 40: 101
https://doi.org/10.1146/annurev-matsci-070909-104517
9 A. White A.. Big data are shaping the future of materials science. MRS Bull., 2013, 38(8): 594
https://doi.org/10.1557/mrs.2013.187
10 Fan Z., Q. Wang H., C. Zheng J.. Searching for the best thermoelectrics through the optimization of transport distribution function. J. Appl. Phys., 2011, 109(7): 073713
https://doi.org/10.1063/1.3563097
11 C. Zheng J., Zhu Y.. Searching for a higher superconducting transition temperature in strained MgB2. Phys. Rev. B, 2006, 73: 024509
https://doi.org/10.1103/PhysRevB.73.024509
12 C. Zheng J.. Asymmetrical transport distribution function: Skewness as a key to enhance thermoelectric performance. Research, 2022, 2022: 9867639
https://doi.org/10.34133/2022/9867639
13 C. Zheng J.. Recent advances on thermoelectric materials. Front. Phys. China, 2008, 3(3): 269
https://doi.org/10.1007/s11467-008-0028-9
14 C. Zheng J., I. Frenkel A., Wu L., Hanson J., Ku W., S. Bozin E., J. L. Billinge S., Zhu Y.. Nanoscale disorder and local electronic properties of CaCu3Ti4O12: An integrated study of electron, neutron, and X-ray diffraction, X-ray absorption fine structure, and first-principles calculations. Phys. Rev. B, 2010, 81(14): 144203
https://doi.org/10.1103/PhysRevB.81.144203
15 Sa N., S. Chong S., Q. Wang H., C. Zheng J.. Anisotropy engineering of ZnO nanoporous frameworks: A lattice dynamics simulation. Nanomaterials (Basel), 2022, 12(18): 3239
https://doi.org/10.3390/nano12183239
16 Cheng H., C. Zheng J.. Ab initio study of anisotropic mechanical and electronic properties of strained carbon-nitride nanosheet with interlayer bonding. Front. Phys., 2021, 16(4): 43505
17 Huang Y., Y. Haw C., Zheng Z., Kang J., C. Zheng J., Q. Wang H.. Biosynthesis of zinc oxide nanomaterials from plant extracts and future green prospects: A topical review. Adv. Sustain. Syst., 2021, 5(6): 2000266
https://doi.org/10.1002/adsu.202000266
18 Q. Wang Z., Cheng H., Y. Lü T., Q. Wang H., P. Feng Y., C. Zheng J.. A super-stretchable boron nanoribbon network. Phys. Chem. Chem. Phys., 2018, 20(24): 16510
19 Li Y., Q. Wang H., J. Chu T., C. Li Y., Li X., Liao X., Wang X., Zhou H., Kang J., C. Chang K., C. Chang T., M. Tsai T., C. Zheng J.. Tuning the nanostructures and optical properties of undoped and N-doped ZnO by supercritical fluid treatment. AIP Adv., 2018, 8(5): 055310
https://doi.org/10.1063/1.5026446
20 L. Li Y., Fan Z., C. Zheng J.. Enhanced thermoelectric performance in graphitic ZnO (0001) nanofilms. J. Appl. Phys., 2013, 113(8): 083705
https://doi.org/10.1063/1.4792469
21 He J., D. Blum I., Q. Wang H., N. Girard S., Doak J., D. Zhao L., C. Zheng J., Casillas G., Wolverton C., Jose-Yacaman M., N. Seidman D., G. Kanatzidis M., P. Dravid V.. Morphology control of nanostructures: Na-doped PbTe–PbS system. Nano Lett., 2012, 12(11): 5979
https://doi.org/10.1021/nl303449x
22 Fan Z., Zheng J., Q. Wang H., C. Zheng J.. Enhanced thermoelectric performance in three-dimensional superlattice of topological insulator thin films. Nanoscale Res. Lett., 2012, 7(1): 570
https://doi.org/10.1186/1556-276X-7-570
23 Wei N., Q. Wang H., C. Zheng J.. Nanoparticle manipulation by thermal gradient. Nanoscale Res. Lett., 2012, 7(1): 154
https://doi.org/10.1186/1556-276X-7-154
24 Wei N., Fan Z., Q. Xu L., P. Zheng Y., Q. Wang H., C. Zheng J.. Knitted graphene-nanoribbon sheet: A mechanically robust structure. Nanoscale, 2012, 4(3): 785
https://doi.org/10.1039/C1NR11200G
25 Q. He J., R. Sootsman J., Q. Xu L., N. Girard S., C. Zheng J., G. Kanatzidis M., P. Dravid V.. Anomalous electronic transport in dual-nanostructured lead telluride. J. Am. Chem. Soc., 2011, 133(23): 8786
https://doi.org/10.1021/ja2006498
26 Wei N., Xu L., Q. Wang H., C. Zheng J.. Strain engineering of thermal conductivity in graphene sheets and nanoribbons: A demonstration of magic flexibility. Nanotechnology, 2011, 22(10): 105705
https://doi.org/10.1088/0957-4484/22/10/105705
27 He J., R. Sootsman J., N. Girard S., C. Zheng J., Wen J., Zhu Y., G. Kanatzidis M., P. Dravid V.. On the origin of increased phonon scattering in nanostructured PbTe-based thermoelectric materials. J. Am. Chem. Soc., 2010, 132(25): 8669
https://doi.org/10.1021/ja1010948
28 Zhu Y., C. Zheng J., Wu L., I. Frenkel A., Hanson J., Northrup P., Ku W.. Nanoscale disorder in CaCu3Ti4O12: A new route to the enhanced dielectric response. Phys. Rev. Lett., 2007, 99(3): 037602
https://doi.org/10.1103/PhysRevLett.99.037602
29 C. Zheng J., Q. Wang H., T. S. Wee A., H. A. Huan C.. Structural and electronic properties of Al nanowires: An ab initio pseudopotential study. Int. J. Nanosci., 2002, 01(02): 159
https://doi.org/10.1142/S0219581X02000097
30 C. Zheng J., Q. Wang H., T. S. Wee A., H. A. Huan C.. Possible complete miscibility of (BN)x(C2)1−x alloys. Phys. Rev. B, 2002, 66(9): 092104
https://doi.org/10.1103/PhysRevB.66.092104
31 C. Zheng J., Q. Wang H., H. A. Huan C., T. S. Wee A.. The structural and electronic properties of (AlN)x(C2)1−x and (AlN)x(BN)1−x alloys. J. Phys.: Condens. Matter, 2001, 13(22): 5295
https://doi.org/10.1088/0953-8984/13/22/322
32 Q. Wang H., C. Zheng J., Z. Wang R., M. Zheng Y., H. Cai S.. Valence-band offsets of III−V alloy heterojunctions. Surf. Interface Anal., 1999, 28(1): 177
https://doi.org/10.1002/(SICI)1096-9918(199908)28:1<177::AID-SIA602>3.0.CO;2-T
33 C. Zheng J., Z. Wang R., M. Zheng Y., H. Cai S.. Valence offsets of three series of alloy heterojunctions. Chin. Phys. Lett., 1997, 14(10): 775
https://doi.org/10.1088/0256-307X/14/10/015
34 C. Zheng J., Zheng Y., Wang R.. Valence offsets of ternary alloy heterojunctions InxGa1-xAs/InxAl1-xAs. Chin. Sci. Bull., 1996, 41(24): 2050
35 Liu L., Wang T., Sun L., Song T., Yan H., Li C., Mu D., Zheng J., Dai Y.. Stable cycling of all‐solid‐state lithium metal batteries enabled by salt engineering of PEO‐based polymer electrolytes. Energy Environ. Mater., 2023, (Feb.): e12580
https://doi.org/10.1002/eem2.12580
36 Zhang W., Y. Du F., Dai Y., C. Zheng J.. Strain engineering of Li+ ion migration in olivine phosphate cathode materials LiMPO4 (M = Mn, Fe, Co) and (LiFePO4)n(LiMnPO4)m superlattices. Phys. Chem. Chem. Phys., 2023, 25(8): 6142
https://doi.org/10.1039/D2CP05241E
37 Zhang B., Wu L., Zheng J., Yang P., Yu X., Ding J., M. Heald S., A. Rosenberg R., V. Venkatesan T., Chen J., J. Sun C., Zhu Y., M. Chow G.. Control of magnetic anisotropy by orbital hybridization with charge transfer in (La0.67Sr0.33MnO3)n/(SrTiO3)n superlattice. NPG Asia Mater., 2018, 10(9): 931
https://doi.org/10.1038/s41427-018-0084-8
38 Zhang L., Y. Lü T., Q. Wang H., X. Zhang W., W. Yang S., C. Zheng J.. First principles studies on the thermoelectric properties of (SrO)m(SrTiO3)n superlattice. RSC Adv., 2016, 6(104): 102172
https://doi.org/10.1039/C6RA19661F
39 C. Zheng J., H. A. Huan C., T. S. Wee A., A. V. Hove M., S. Fadley C., J. Shi F., Rotenberg E., R. Barman S., J. Paggel J., Horn K., Ebert P., Urban K.. Atomic scale structure of the 5-fold surface of a AlPdMn quasicrystal: A quantitative X-ray photoelectron diffraction analysis. Phys. Rev. B, 2004, 69(13): 134107
https://doi.org/10.1103/PhysRevB.69.134107
40 Q. Wang H., Xu J., Lin X., Li Y., Kang J., C. Zheng J.. Determination of the embedded electronic states at nanoscale interface via surface-sensitive photoemission spectroscopy. Light Sci. Appl., 2021, 10(1): 153
https://doi.org/10.1038/s41377-021-00592-9
41 A. Van Hove M., Hermann K., R. Watson P.. The NIST surface structure database – SSD version 4. Acta Crystallogr. B, 2002, 58(3): 338
https://doi.org/10.1107/S0108768102002434
42 Q. Wang H., Altman E., Broadbridge C., Zhu Y., Henrich V.. Determination of electronic structure of oxide-oxide interfaces by photoemission spectroscopy. Adv. Mater., 2010, 22: 2950
https://doi.org/10.1002/adma.200903759
43 Zhou H., Wu L., Q. Wang H., C. Zheng J., Zhang L., Kisslinger K., Li Y., Wang Z., Cheng H., Ke S., Li Y., Kang J., Zhu Y.. Interfaces between hexagonal and cubic oxides and their structure alternatives. Nat. Commun., 2017, 8(1): 1474
https://doi.org/10.1038/s41467-017-01655-5
44 D. Steiner J., Cheng H., Walsh J., Zhang Y., Zydlewski B., Mu L., Xu Z., M. Rahman M., Sun H., M. Michel F., J. Sun C., Nordlund D., Luo W., C. Zheng J., L. Xin H., Lin F.. Targeted surface doping with reversible local environment improves oxygen stability at the electrochemical interfaces of nickel-rich cathode materials. ACS Appl. Mater. Interfaces, 2019, 11(41): 37885
https://doi.org/10.1021/acsami.9b14729
45 C. Zheng J., Q. Wang H., T. S. Wee A., H. A. Huan C.. Trends in bonding configuration at SiC/III–V semiconductor interfaces. Appl. Phys. Lett., 2001, 79(11): 1643
https://doi.org/10.1063/1.1402162
46 Q. Wang H., C. Zheng J., T. S. Wee A., H. A. Huan C.. Study of electronic properties and bonding configuration at the BN/SiC interface. J. Electron Spectrosc. Relat. Phenom., 2001, 114–116: 483
47 Lin S., Zhang B., Y. Lü T., C. Zheng J., Pan H., Chen H., Lin C., Li X., Zhou J.. Inorganic lead-free B-γ-CsSnI3 perovskite solar cells using diverse electron-transporting materials: A simulation study. ACS Omega, 2021, 6(40): 26689
https://doi.org/10.1021/acsomega.1c04096
48 Y. Du F., Zhang W., Q. Wang H., C. Zheng J.. Enhancement of thermal rectification by asymmetry engineering of thermal conductivity and geometric structure for the multi-segment thermal rectifier. Chin. Phys. B, 2023, 32(6): 064402
https://doi.org/10.1088/1674-1056/acc78c
49 Kulichenko M., S. Smith J., Nebgen B., W. Li Y., Fedik N., I. Boldyrev A., Lubbers N., Barros K., Tretiak S.. The rise of neural networks for materials and chemical dynamics. J. Phys. Chem. Lett., 2021, 12(26): 6227
https://doi.org/10.1021/acs.jpclett.1c01357
50 Sha W., Guo Y., Yuan Q., Tang S., Zhang X., Lu S., Guo X., C. Cao Y., Cheng S.. Artificial intelligence to power the future of materials science and engineering. Adv. Intell. Syst., 2020, 2(4): 1900143
https://doi.org/10.1002/aisy.201900143
51 Leonelli S., Scientific research and big data, in: The Stanford Encyclopedia of Philosophy, Summer 2020 Ed., edited by E. N. Zalta, Metaphysics Research Lab, Stanford University, 2020
52 Westermayr J., Gastegger M., T. Schütt K., J. Maurer R.. Perspective on integrating machine learning into computational chemistry and materials science. J. Chem. Phys., 2021, 154(23): 230903
https://doi.org/10.1063/5.0047760
53 Morgan D., Jacobs R.. Opportunities and challenges for machine learning in materials science. Annu. Rev. Mater. Res., 2020, 50(1): 71
https://doi.org/10.1146/annurev-matsci-070218-010015
54 Chen C., Zuo Y., Ye W., Li X., Deng Z., P. Ong S.. A critical review of machine learning of energy materials. Adv. Energy Mater., 2020, 10(8): 1903242
55 Wei J., Chu X., Y. Sun X., Xu K., X. Deng H., Chen J., Wei Z., Lei M.. Machine learning in materials science. InfoMat, 2019, 1(3): 338
https://doi.org/10.1002/inf2.12028
56 Pilania G.. Machine learning in materials science: From explainable predictions to autonomous design. Comput. Mater. Sci., 2021, 193: 110360
https://doi.org/10.1016/j.commatsci.2021.110360
57 T. Butler K., W. Davies D., Cartwright H., Isayev O., Walsh A.. Machine learning for molecular and materials science. Nature, 2018, 559(7715): 547
https://doi.org/10.1038/s41586-018-0337-2
58 Oviedo F., L. Ferres J., Buonassisi T., T. Butler K.. Interpretable and explainable machine learning for materials science and chemistry. Acc. Mater. Res., 2022, 3(6): 597
https://doi.org/10.1021/accountsmr.1c00244
59 F. Rodrigues Jr J., C. F. Florea M., de Oliveira D., Diamond D., N. Oliveira Jr O.. Big data and machine learning for materials science. Discover Materials, 2021, 1(1): 12
https://doi.org/10.1007/s43939-021-00012-0
60 Choudhary K., DeCost B., Chen C., Jain A., Tavazza F., Cohn R., W. Park C., Choudhary A., Agrawal A., J. L. Billinge S., Holm E., P. Ong S., Wolverton C.. Recent advances and applications of deep learning methods in materials science. npj Comput. Mater., 2022, 8: 59
https://doi.org/10.1038/s41524-022-00734-6
61 Samuel L.. Some studies in machine learning using the game of checkers. IBM J. Res. Develop., 1959, 3(3): 210
https://doi.org/10.1147/rd.33.0210
62 Breiman L., H. Friedman J., A. Olshen R., J. Stone C.. Classification and Regression Trees, 1983
63 G. Valiant L., A theory of the learnable, in: STOC ’84 Proceedings of the Sixteenth Annual ACM Symposium on Theory of Computing, pp 436–445, 1984
64 T. Mitchell, Machine Learning, New York, USA: McGrawHill, 1997
65 Roweis S., Ghahramani Z.. A unifying review of linear Gaussian models. Neural Comput., 1999, 11(2): 305
66 C. Zheng J., Y. Chen J., W. Shuai J., H. Cai S., Z. Wang R.. Storage capacity of the Hopfield neural network. Physica A, 1997, 246(3): 313
https://doi.org/10.1016/S0378-4371(97)00359-2
67 W. Shuai J., C. Zheng J., X. Chen Z., T. Liu R., X. Wu B.. The three-dimensional rotation neural network. Physica A, 1997, 238: 23
https://doi.org/10.1016/S0378-4371(96)00465-7
68 Mohri M.Rostamizadeh A.Talwalkar A., Foundations of Machine Learning, 2nd Ed. , Adaptive Computation and Machine Learning. Cambridge, MA: MIT Press, 2018
69 Vaswani A.Shazeer N.Parmar N.Uszkoreit J.Jones L. N. Gomez A.Kaiser L.Polosukhin I., Attention is all you need, arXiv: 1706.03762 (2017)
70 Wang A.Pruksachatkun Y.Nangia N.Singh A.Michael J. Hill F.Levy O.R. Bowman S., SuperGLUE: A stickier benchmark for general-purpose language understanding systems, arXiv: 1905.00537 (2019)
71 Erhan D., Bengio Y., Courville A., A. Manzagol P., Vincent P., Bengio S.. Why does unsupervised pre-training help deep learning? J. Mach. Learn. Res., 2010, 11: 625
72 Feng Z.Guo D.Tang D.Duan N.Feng X. Gong M.Shou L.Qin B.Liu T.Jiang D. Zhou M., CodeBERT: A pre-trained model for programming and natural languages, arXiv: 2002.08155 (2020)
73 Bao H.Dong L.Wei F., BEIT: BERT pre-training of image transformers, arXiv: 2106.08254 (2021)
74 Hakhamaneshi K.Nassar M.Phielipp M. Abbeel P.Stojanović V., Pretraining graph neural networks for few-shot analog circuit modeling and design, arXiv: 2203.15913 (2022)
75 Li J.Li D. Xiong C.Hoi S., BLIP: Bootstrapping language-image pre-training for unified vision-language understanding and generation, arXiv: 2201.12086 (2022)
76 Lu K.Grover A. Abbeel P.Mordatch I., Pretrained transformers as universal computation engines, arXiv: 2103.05247 (2021)
77 Reid M.Yamada Y.S. Gu S., Can Wikipedia help offline reinforcement learning? arXiv: 2201.12122 (2022)
78 Sun C.Qiu X.Xu Y.Huang X., How to fine-tune BERT for text classification? arXiv: 1905.05583 (2019)
79 Liu H., Tam D., Muqeeth M., Mohta J., Huang T., Bansal M., Raffel C.. Few-shot parameter-efficient fine-tuning is better and cheaper than in-context learning. Advances in Neural Information Processing Systems, 2022, 35: 1950
80 Devlin J., Chang M. W., Lee K., Toutanova K., BERT: Pre-training of deep bidirectional transformers for language understanding, arXiv: 1810.04805 (2018)
81 Lan Z., Chen M., Goodman S., Gimpel K., Sharma P., Soricut R., ALBERT: A lite BERT for self-supervised learning of language representations, arXiv: 1909.11942 (2019)
82 Liu Y., Ott M., Goyal N., Du J., Joshi M., Chen D., Levy O., Lewis M., Zettlemoyer L., Stoyanov V., RoBERTa: A robustly optimized BERT pretraining approach, arXiv: 1907.11692 (2019)
83 Vig J., Belinkov Y., Analyzing the structure of attention in a transformer language model, arXiv: 1906.04284 (2019)
84 Zhang S., Xie L., Improving attention mechanism in graph neural networks via cardinality preservation, in: Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, International Joint Conferences on Artificial Intelligence Organization, 2020, p. 1395
85 Tay Y., Tran V. Q., Dehghani M., Ni J., Bahri D., Mehta H., Qin Z., Hui K., Zhao Z., Gupta J., Schuster T., Cohen W. W., Metzler D.. Transformer memory as a differentiable search index. Advances in Neural Information Processing Systems, 2022, 35: 21831
86 Raffel C., Shazeer N., Roberts A., Lee K., Narang S., Matena M., Zhou Y., Li W., Liu P. J.. Exploring the limits of transfer learning with a unified text-to-text transformer. J. Mach. Learn. Res., 2020, 21(1): 5485
87 Shin T., Razeghi Y., Logan R. L. IV, Wallace E., Singh S., AutoPrompt: Eliciting knowledge from language models with automatically generated prompts, arXiv: 2010.15980 (2020)
88 Ding N., Hu S., Zhao W., Chen Y., Liu Z., Zheng H. T., Sun M., OpenPrompt: An open-source framework for prompt-learning, arXiv: 2111.01998 (2021)
89 Zhang S., Roller S., Goyal N., Artetxe M., Chen M., Chen S., Dewan C., Diab M., Li X., Lin X. V., Mihaylov T., Ott M., Shleifer S., Shuster K., Simig D., Koura P. S., Sridhar A., Wang T., Zettlemoyer L., OPT: Open pre-trained transformer language models, arXiv: 2205.01068 (2022)
90 Lieber O., Sharir O., Lenz B., Shoham Y., Jurassic-1: Technical Details and Evaluation, AI21 Labs, Tech. Rep., 2021
91 Brown T., Mann B., Ryder N., Subbiah M., Kaplan J. D., et al., Language models are few-shot learners, in: Advances in Neural Information Processing Systems, edited by H. Larochelle, M. Ranzato, R. Hadsell, M. Balcan, and H. Lin, Vol. 33, Curran Associates, Inc., 2020, pp 1877–1901, arXiv: 2005.14165
92 Bapna A., Caswell I., Kreutzer J., Firat O., van Esch D., et al., Building machine translation systems for the next thousand languages, arXiv: 2205.03983 (2022)
93 Mikolov T., Chen K., Corrado G., Dean J., Efficient estimation of word representations in vector space, arXiv: 1301.3781 (2013)
94 Pennington J., Socher R., Manning C., GloVe: Global vectors for word representation, in: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), Doha, Qatar: Association for Computational Linguistics, Oct. 2014, pp 1532–1543
95 Melamud O., Goldberger J., Dagan I., Context2vec: Learning generic context embedding with bidirectional LSTM, in: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning, Berlin, Germany: Association for Computational Linguistics, Aug. 2016, pp 51–61
96 Dai H., Dai B., Song L., Discriminative embeddings of latent variable models for structured data, arXiv: 1603.05629 (2016)
97 Yang J., Zhao R., Zhu M., Hallac D., Sodnik J., Leskovec J., Driver2vec: Driver identification from automotive data, arXiv: 2102.05234 (2021)
98 Schneider S., Baevski A., Collobert R., Auli M., wav2vec: Unsupervised pre-training for speech recognition, arXiv: 1904.05862 (2019)
99 Zhou H., Zhang S., Peng J., Zhang S., Li J., Xiong H., Zhang W., Informer: Beyond efficient transformer for long sequence time-series forecasting, in: Proceedings of the AAAI Conference on Artificial Intelligence 35(12), 11106 (2021), arXiv: 2012.07436
100 Beltagy I., Peters M. E., Cohan A., Longformer: The long-document transformer, arXiv: 2004.05150 (2020)
101 Han K., Wang Y., Chen H., Chen X., Guo J., Liu Z., Tang Y., Xiao A., Xu C., Xu Y., Yang Z., Zhang Y., Tao D., A survey on vision transformer, IEEE Trans. Pattern Anal. Mach. Intell. 45(1), 87 (2023)
102 Alayrac J. B., Donahue J., Luc P., Miech A., Barr I., et al., Flamingo: A visual language model for few-shot learning, Advances in Neural Information Processing Systems 35, 23716 (2022), arXiv: 2204.14198
103 Yu J., Wang Z., Vasudevan V., Yeung L., Seyedhosseini M., Wu Y., CoCa: Contrastive captioners are image-text foundation models, arXiv: 2205.01917 (2022)
104 Liu X., Gong C., Wu L., Zhang S., Su H., Liu Q., FuseDream: Training-free text-to-image generation with improved CLIP+GAN space optimization, arXiv: 2112.01573 (2021)
105 Radford A., Kim J. W., Hallacy C., Ramesh A., Goh G., Agarwal S., Sastry G., Askell A., Mishkin P., Clark J., Krueger G., Sutskever I., Learning transferable visual models from natural language supervision, arXiv: 2103.00020 (2021)
106 He L., Zhou Q., Li X., Niu L., Cheng G., Li X., Liu W., Tong Y., Ma L., Zhang L., End-to-end video object detection with spatial-temporal transformers, in: Proceedings of the 29th ACM International Conference on Multimedia, 2021, pp 1507–1516, arXiv: 2105.10920
107 Zhai X., Wang X., Mustafa B., Steiner A., Keysers D., Kolesnikov A., Beyer L., LiT: Zero-shot transfer with locked-image text tuning, in: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2022, pp 18123–18133, arXiv: 2111.07991
108 Trockman A., Kolter J. Z., Patches are all you need? arXiv: 2201.09792 (2022)
109 Ramesh A., Pavlov M., Goh G., Gray S., Voss C., Radford A., Chen M., Sutskever I., Zero-shot text-to-image generation, in: International Conference on Machine Learning, 2021, pp 8821–8831, arXiv: 2102.12092
110 Tewari A., Thies J., Mildenhall B., Srinivasan P., Tretschk E., Wang Y., Lassner C., Sitzmann V., Martin-Brualla R., Lombardi S., Simon T., Theobalt C., Niessner M., Barron J. T., Wetzstein G., Zollhoefer M., Golyanik V., Advances in neural rendering, Computer Graphics Forum 41(2), 703 (2022), arXiv: 2111.05849
111 Mildenhall B., Srinivasan P. P., Tancik M., Barron J. T., Ramamoorthi R., Ng R., NeRF: Representing scenes as neural radiance fields for view synthesis, Communications of the ACM 65(1), 99 (2021), arXiv: 2003.08934
112 Zheng S., Pan J., Lu C., Gupta G., PointNorm: Normalization is all you need for point cloud analysis, arXiv: 2207.06324 (2022)
113 Ran H., Liu J., Wang C., Surface representation for point clouds, in: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2022, pp 18942–18952, arXiv: 2205.05740
114 Ma X., Qin C., You H., Ran H., Fu Y., Rethinking network design and local geometry in point cloud: A simple residual MLP framework, arXiv: 2202.07123 (2022)
115 Wang Y., Sun Y., Liu Z., Sarma S. E., Bronstein M. M., Solomon J. M., Dynamic graph CNN for learning on point clouds, arXiv: 1801.07829 (2018)
116 Silver D., Schrittwieser J., Simonyan K., Antonoglou I., Huang A., Guez A., Hubert T., Baker L., Lai M., Bolton A., Chen Y., Lillicrap T., Hui F., Sifre L., van den Driessche G., Graepel T., Hassabis D.. Mastering the game of Go without human knowledge. Nature, 2017, 550(7676): 354
https://doi.org/10.1038/nature24270
117 Zhao E., Yan R., Li J., Li K., Xing J., AlphaHoldem: High-performance artificial intelligence for heads-up no-limit poker via end-to-end reinforcement learning, in: Proceedings of the AAAI Conference on Artificial Intelligence 36(4), 4689 (2022)
118 Zou S., Xu T., Liang Y., Finite-sample analysis for SARSA with linear function approximation, arXiv: 1902.02234 (2019)
119 Watkins C. J. C. H., Dayan P., Q-learning, Machine Learning 8(3), 279 (1992)
120 Abbeel P., Ng A. Y., Apprenticeship learning via inverse reinforcement learning, in: Proceedings of the Twenty-First International Conference on Machine Learning, Ser. ICML ’04, New York, NY, USA: Association for Computing Machinery, 2004
121 Finn C., Abbeel P., Levine S., Model-agnostic meta-learning for fast adaptation of deep networks, in: International Conference on Machine Learning, 2017, pp 1126–1135, arXiv: 1703.03400
122 Fifty C., Amid E., Zhao Z., Yu T., Anil R., Finn C., Efficiently identifying task groupings for multi-task learning, Advances in Neural Information Processing Systems 34, 27503 (2021), arXiv: 2109.04617
123 Anand N., Precup D., Preferential temporal difference learning, arXiv: 2106.06508 (2021)
124 Chen K., Cao R., James S., Li Y., Liu Y. H., Abbeel P., Dou Q., Sim-to-real 6D object pose estimation via iterative self-training for robotic bin-picking, in: Computer Vision – ECCV 2022: 17th European Conference, Tel Aviv, Israel, October 23–27, 2022, Proceedings, Part XXXIX (pp 533−550), Cham: Springer Nature Switzerland, arXiv: 2204.07049
125 Mnih V., Kavukcuoglu K., Silver D., Graves A., Antonoglou I., Wierstra D., Riedmiller M., Playing Atari with deep reinforcement learning, arXiv: 1312.5602 (2013)
126 Lillicrap T. P., Hunt J. J., Pritzel A., Heess N., Erez T., Tassa Y., Silver D., Wierstra D., Continuous control with deep reinforcement learning, arXiv: 1509.02971 (2015)
127 Yarats D., Brandfonbrener D., Liu H., Laskin M., Abbeel P., Lazaric A., Pinto L., Don’t change the algorithm, change the data: Exploratory data for offline reinforcement learning, arXiv: 2201.13425 (2022)
128 Ahn M., Brohan A., Brown N., Chebotar Y., Cortes O., et al., Do as I can, not as I say: Grounding language in robotic affordances, in: Conference on Robot Learning, 2023, pp 287–318, arXiv: 2204.01691
129 James S., Abbeel P., Coarse-to-fine Q-attention with learned path ranking, arXiv: 2204.01571 (2022)
130 Qi C., Abbeel P., Grover A., Imitating, fast and slow: Robust learning from demonstrations via decision-time planning, arXiv: 2204.03597 (2022)
131 Wang L., Zhang X., Yang K., Yu L., Li C., Hong L., Zhang S., Li Z., Zhong Y., Zhu J., Memory replay with data compression for continual learning, arXiv: 2202.06592 (2022)
132 Chen L., Lu K., Rajeswaran A., Lee K., Grover A., Laskin M., Abbeel P., Srinivas A., Mordatch I., Decision transformer: Reinforcement learning via sequence modeling, Advances in Neural Information Processing Systems 34, 15084 (2021), arXiv: 2106.01345
133 Parker-Holder J., Jiang M., Dennis M., Samvelyan M., Foerster J., Grefenstette E., Rocktäschel T., Evolving curricula with regret-based environment design, in: International Conference on Machine Learning, 2022, pp 17473–17498, arXiv: 2203.01302
134 Wang R., Lehman J., Clune J., Stanley K. O., Paired open-ended trailblazer (POET): Endlessly generating increasingly complex and diverse learning environments and their solutions, arXiv: 1901.01753 (2019)
135 Li Z., Li L., Ma Z., Zhang P., Chen J., Zhu J., READ: Large-scale neural scene rendering for autonomous driving, arXiv: 2205.05509 (2022)
136 Tang W., Ho C. J., Liu Y., Bandit learning with delayed impact of actions, in: Advances in Neural Information Processing Systems, edited by A. Beygelzimer, Y. Dauphin, P. Liang, and J. W. Vaughan, 2021, arXiv: 1904.01763
137 Gao Z., Han Y., Ren Z., Zhou Z., Batched multi-armed bandits problem, in: Advances in Neural Information Processing Systems, edited by H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. Fox, and R. Garnett, Curran Associates, Inc., 2019, arXiv: 1904.01763
138 Yue Y., Broder J., Kleinberg R., Joachims T.. The k-armed dueling bandits problem. J. Comput. Syst. Sci., 2012, 78(5): 1538
https://doi.org/10.1016/j.jcss.2011.12.028
139 Carpentier A., Lazaric A., Ghavamzadeh M., Munos R., Auer P., Antos A., Upper-confidence-bound algorithms for active learning in multi-armed bandits, in: Algorithmic Learning Theory: 22nd International Conference, ALT 2011, Espoo, Finland, October 5−7, 2011, Proceedings 22 (pp 189–203), Springer Berlin Heidelberg, arXiv: 1507.04523
140 Ye W., Liu S., Kurutach T., Abbeel P., Gao Y., Mastering Atari games with limited data, Advances in Neural Information Processing Systems 34, 25476 (2021), arXiv: 2111.00210
141 Samvelyan M., Rashid T., Schroeder de Witt C., Farquhar G., Nardelli N., Rudner T. G. J., Hung C. M., Torr P. H. S., Foerster J., Whiteson S., The StarCraft multi-agent challenge, in: Proceedings of the 18th International Conference on Autonomous Agents and MultiAgent Systems, 2019, arXiv: 1902.04043
142 Wang T., Gupta T., Mahajan A., Peng B., Whiteson S., Zhang C., RODE: Learning roles to decompose multi-agent tasks, arXiv: 2010.01523 (2020)
143 Vinyals O., Babuschkin I., Czarnecki W. M., Mathieu M., Dudzik A., et al.. Grandmaster level in StarCraft II using multi-agent reinforcement learning. Nature, 2019, 575(7782): 350
https://doi.org/10.1038/s41586-019-1724-z
144 Du W., Ding S., A survey on multi-agent deep reinforcement learning: From the perspective of challenges and applications, Artif. Intell. Rev. 54(5), 3215 (2021)
145 Biamonte J., Wittek P., Pancotti N., Rebentrost P., Wiebe N., Lloyd S.. Quantum machine learning. Nature, 2017, 549: 195
https://doi.org/10.1038/nature23474
146 Liu Y., Arunachalam S., Temme K., A rigorous and robust quantum speed-up in supervised machine learning, Nat. Phys. 17(9), 1013 (2021)
147 Havlíček V., Córcoles A. D., Temme K., Harrow A. W., Kandala A., Chow J. M., Gambetta J. M.. Supervised learning with quantum-enhanced feature spaces. Nature, 2019, 567(7747): 209
https://doi.org/10.1038/s41586-019-0980-2
148 Moradi S., Brandner C., Spielvogel C., Krajnc D., Hillmich S., Wille R., Drexler W., Papp L.. Clinical data classification with noisy intermediate scale quantum computers. Sci. Rep., 2022, 12(1): 1851
https://doi.org/10.1038/s41598-022-05971-9
149 Zheng J., He K., Zhou J., Jin Y., Li C. M., Combining reinforcement learning with Lin-Kernighan-Helsgaun algorithm for the traveling salesman problem, in: Proceedings of the AAAI Conference on Artificial Intelligence 35(14), 12445 (2021), arXiv: 2012.04461
150 Li Z., Chen Q., Koltun V., Combinatorial optimization with graph convolutional networks and guided tree search, Advances in Neural Information Processing Systems 31, 2018, arXiv: 1810.10659
151 Sundararajan M., Taly A., Yan Q., Axiomatic attribution for deep networks, in: International Conference on Machine Learning, 2017, pp 3319–3328, arXiv: 1703.01365
152 Ribeiro M. T., Singh S., Guestrin C., “Why should I trust you?” Explaining the predictions of any classifier, in: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2016, pp 1135−1144, arXiv: 1602.04938
153 Lundberg S., Lee S. I., A unified approach to interpreting model predictions, arXiv: 1705.07874 (2017)
154 Crabbe J., Qian Z., Imrie F., van der Schaar M., Explaining latent representations with a corpus of examples, in: Advances in Neural Information Processing Systems, edited by M. Ranzato, A. Beygelzimer, Y. Dauphin, P. Liang, and J. W. Vaughan, Curran Associates, Inc., 2021, pp 12154–12166, arXiv: 2110.15355
155 Springenberg J. T., Dosovitskiy A., Brox T., Riedmiller M., Striving for simplicity: The all convolutional net, arXiv: 1412.6806 (2014)
156 Ying R., Bourgeois D., You J., Zitnik M., Leskovec J., GNNExplainer: Generating explanations for graph neural networks, arXiv: 1903.03894 (2019)
157 Yuan H., Yu H., Wang J., Li K., Ji S., On explainability of graph neural networks via subgraph explorations, in: International Conference on Machine Learning, 2021, pp 12241–12252, arXiv: 2102.05152
158 Huang Q., Yamada M., Tian Y., Singh D., Yin D., Chang Y., GraphLIME: Local interpretable model explanations for graph neural networks, IEEE Transactions on Knowledge and Data Engineering 35(7), 6968 (2023), arXiv: 2001.06216
159 Yuan H., Yu H., Gui S., Ji S., Explainability in graph neural networks: A taxonomic survey, IEEE Transactions on Pattern Analysis and Machine Intelligence 45(5), 5782 (2023), arXiv: 2012.15445
160 Katz G., Barrett C., Dill D., Julian K., Kochenderfer M., Reluplex: An efficient SMT solver for verifying deep neural networks, in: Computer Aided Verification: 29th International Conference, CAV 2017, Heidelberg, Germany, July 24−28, 2017, Proceedings, Part I 30, pp 97−117, Springer International Publishing, arXiv: 1702.01135
161 Wang S., Zhang H., Xu K., Lin X., Jana S., Hsieh C. J., Kolter J. Z., Beta-CROWN: Efficient bound propagation with per-neuron split constraints for complete and incomplete neural network verification, Advances in Neural Information Processing Systems 34, 2021, arXiv: 2103.06624
162 Owen M. P., Panken A., Moss R., Alvarez L., Leeper C., ACAS Xu: Integrated collision avoidance and detect and avoid capability for UAS, in: IEEE/AIAA 38th Digital Avionics Systems Conference (DASC), 2019
163 Mittal S., Vaishay S., A survey of techniques for optimizing deep learning on GPUs, J. Systems Archit. 99, 101635 (2019)
164 Wang F., Zhang W., Lai S., Hao M., Wang Z.. Dynamic GPU energy optimization for machine learning training workloads. IEEE Transactions on Parallel and Distributed Systems, 2022, 33(11): 2943
https://doi.org/10.1109/TPDS.2021.3137867
165 David R., Duke J., Jain A., Janapa Reddi V., Jeffries N., Li J., Kreeger N., Nappier I., Natraj M., Wang T., Warden P., Rhodes R., TensorFlow Lite Micro: Embedded machine learning for TinyML systems, in: Proceedings of Machine Learning and Systems, edited by A. Smola, A. Dimakis, and I. Stoica, 2021, pp 800–811, arXiv: 2010.08678
166 Tanasescu C., Kesarwani V., Inkpen D., Metaphor detection by deep learning and the place of poetic metaphor in digital humanities, in: The Thirty-First International FLAIRS Conference, 2018
167 Surden H., Machine learning and law, Wash. L. Rev. 89, 87 (2014)
168 De Spiegeleer J., Madan D. B., Reyners S., Schoutens W., Machine learning for quantitative finance: Fast derivative pricing, hedging and fitting, Quantitative Finance 18(10), 1635 (2018)
169 Solano-Alvarez W., Peet M., Pickering E., Jaiswal J., Bevan A., Bhadeshia H.. Synchrotron and neural network analysis of the influence of composition and heat treatment on the rolling contact fatigue of hypereutectoid pearlitic steels. Materials Science and Engineering A, 2017, 707: 259
https://doi.org/10.1016/j.msea.2017.09.045
170 Li J. J., Dai Y., Zheng J. C.. Strain engineering of ion migration in LiCoO2. Front. Phys., 2022, 17(1): 13503
https://doi.org/10.1007/s11467-021-1086-5
171 Bhadeshia H. K. D. H.. Neural networks and information in materials science. Statistical Analysis and Data Mining, 2009, 1: 296
https://doi.org/10.1002/sam.10018
172 Liu Y., Esan O. C., Pan Z., An L.. Machine learning for advanced energy materials. Energy and AI, 2021, 3: 100049
https://doi.org/10.1016/j.egyai.2021.100049
173 Kalidindi S. R.. Feature engineering of material structure for AI-based materials knowledge systems. J. Appl. Phys., 2020, 128(4): 041103
https://doi.org/10.1063/5.0011258
174 Xiang Z., Fan M., Vázquez Tovar G., Trehern W., Yoon B. J., Qian X., Arroyave R., Qian X., Physics-constrained automatic feature engineering for predictive modeling in materials science, in: Proceedings of the AAAI Conference on Artificial Intelligence 35(12), pp 10414–10421 (2021)
175 Bengio Y., Courville A., Vincent P.. Representation learning: A review and new perspectives. IEEE Trans. Pattern Anal. Mach. Intell., 2013, 35(8): 1798
https://doi.org/10.1109/TPAMI.2013.50
176 Routh P. K., Liu Y., Marcella N., Kozinsky B., Frenkel A. I.. Latent representation learning for structural characterization of catalysts. J. Phys. Chem. Lett., 2021, 12(8): 2086
https://doi.org/10.1021/acs.jpclett.0c03792
177 Franceschetti A., Zunger A.. The inverse band-structure problem of finding an atomic configuration with given electronic properties. Nature, 1999, 402(6757): 6757
https://doi.org/10.1038/46995
178 Liu Z., Zhu D., Raju L., Cai W.. Tackling photonic inverse design with machine learning. Adv. Sci., 2021, 8: 2002923
https://doi.org/10.1002/advs.202002923
179 Saal J. E., Kirklin S., Aykol M., Meredig B., Wolverton C.. Materials design and discovery with high-throughput density functional theory: The open quantum materials database (OQMD). JOM, 2013, 65(11): 1501
https://doi.org/10.1007/s11837-013-0755-4
180 Kirklin S., Saal J. E., Meredig B., Thompson A., Doak J. W., Aykol M., Rühl S., Wolverton C.. The open quantum materials database (OQMD): Assessing the accuracy of DFT formation energies. npj Comput. Mater., 2015, 1(1): 15010
https://doi.org/10.1038/npjcompumats.2015.10
181 Jain A., Ong S. P., Hautier G., Chen W., Richards W. D., Dacek S., Cholia S., Gunter D., Skinner D., Ceder G., Persson K., The materials project: A materials genome approach to accelerating materials innovation, APL Mater. 1(1), 011002 (2013)
182 Choudhary K., Garrity K. F., Reid A. C. E., DeCost B., Biacchi A. J., et al.. The joint automated repository for various integrated simulations (JARVIS) for data-driven materials design. npj Comput. Mater., 2020, 6(1): 173
https://doi.org/10.1038/s41524-020-00440-1
183
184
185
186
187
188 Zhou J., Shen L., Costa M. D., Persson K. A., Ong S. P., Huck P., Lu Y., Ma X., Chen Y., Tang H., Feng Y. P., 2DMatPedia, an open computational database of two-dimensional materials from top-down and bottom-up approaches, Scientific Data 6, 86 (2019)
189 Hellenbrandt M.. The inorganic crystal structure database (ICSD) — Present and future. Crystallography Rev., 2004, 10(1): 17
https://doi.org/10.1080/08893110410001664882
190 Gražulis S., Daškevič A., Merkys A., Chateigner D., Lutterotti L., Quirós M., Serebryanaya N. R., Moeck P., Downs R. T., Le Bail A.. Crystallography Open Database (COD): An open-access collection of crystal structures and platform for world-wide collaboration. Nucleic Acids Research, 2011, 40(D1): D420
https://doi.org/10.1093/nar/gkr900
191 Zheng J. C., Wu L., Zhu Y.. Aspherical electron scattering factors and their parameterizations for elements from H to Xe. Journal of Applied Crystallography, 2009, 42: 1043
https://doi.org/10.1107/S0021889809033147
192 Jumper J., Evans R., Pritzel A., Green T., Figurnov M., et al., Highly accurate protein structure prediction with AlphaFold, Nature 596(7873), 583 (2021)
193 Dunn A., Wang Q., Ganose A., Dopp D., Jain A., Benchmarking materials property prediction methods: The Matbench test set and Automatminer reference algorithm, npj Comput. Mater. 6(1), 138 (2020)
194 Lin R., Zhang R., Wang C., Yang X. Q., Xin H. L.. TEMImageNet training library and AtomSegNet deep-learning models for high-precision atom segmentation, localization, denoising, and deblurring of atomic-resolution images. Sci. Rep., 2021, 11(1): 5386
https://doi.org/10.1038/s41598-021-84499-w
195 Han L., Cheng H., Liu W., Li H., Ou P., Lin R., Wang H.-T., Pao C.-W., Head A. R., Wang C.-H., Tong X., Sun C.-J., Pong W.-F., Luo J., Zheng J.-C., Xin H. L.. A single-atom library for guided monometallic and concentration-complex multimetallic designs. Nat. Mater., 2022, 21: 681
https://doi.org/10.1038/s41563-022-01252-y
196 Mrdjenovich D., Horton M. K., Montoya J. H., Legaspi C. M., Dwaraknath S., Tshitoyan V., Jain A., Persson K. A.. Propnet: A knowledge graph for materials science. Matter, 2020, 2(2): 464
https://doi.org/10.1016/j.matt.2019.11.013
197 Lin T. S., Coley C. W., Mochigase H., Beech H. K., Wang W., Wang Z., Woods E., Craig S. L., Johnson J. A., Kalow J. A., Jensen K. F., Olsen B. D.. BigSMILES: A structurally-based line notation for describing macromolecules. ACS Cent. Sci., 2019, 5(9): 1523
https://doi.org/10.1021/acscentsci.9b00476
198 Krenn M., Ai Q., Barthel S., Carson N., Frei A., et al.. SELFIES and the future of molecular string representations. Patterns, 2022, 3(10): 100588
https://doi.org/10.1016/j.patter.2022.100588
199 Michel K., Meredig B.. Beyond bulk single crystals: A data format for all materials structure–property–processing relationships. MRS Bull., 2016, 41(8): 617
https://doi.org/10.1557/mrs.2016.166
200 Wang M., Zheng D., Ye Z., Gan Q., Li M., Song X., Zhou J., Ma C., Yu L., Gai Y., Xiao T., He T., Karypis G., Li J., Zhang Z., Deep graph library: A graph-centric, highly-performant package for graph neural networks, arXiv: 1909.01315 (2019)
201 Babuschkin I., Baumli K., Bell A., Bhupatiraju S., Bruce J., et al., The DeepMind JAX Ecosystem, 2020
202 Chollet F., et al., Keras, URL: github.com/fchollet/keras (2015)
203 Paszke A., Gross S., Chintala S., Chanan G., Yang E., DeVito Z., Lin Z., Desmaison A., Antiga L., Lerer A., Automatic differentiation in PyTorch, in: 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA, 2017
204 Abadi M., Agarwal A., Barham P., Brevdo E., Chen Z., et al., TensorFlow: Large-scale machine learning on heterogeneous systems, 2015, URL: www.tensorflow.org
205 Wolf T., Debut L., Sanh V., Chaumond J., Delangue C., Moi A., Cistac P., Rault T., Louf R., Funtowicz M., Davison J., Shleifer S., von Platen P., Ma C., Jernite Y., Plu J., Xu C., Le Scao T., Gugger S., Drame M., Lhoest Q., Rush A. M., HuggingFace's Transformers: State-of-the-art natural language processing, arXiv: 1910.03771 (2019)
206 OpenRefine: A free, open source, powerful tool for working with messy data, 2022
207 , 2022
208
209 , 2022
210 , 2020
211 Himanen L., Jäger M. O., Morooka E. V., Canova F. F., Ranawat Y. S., Gao D. Z., Rinke P., Foster A. S.. DScribe: Library of descriptors for machine learning in materials science. Comput. Phys. Commun., 2020, 247: 106949
https://doi.org/10.1016/j.cpc.2019.106949
212 Hu W., Fey M., Zitnik M., Dong Y., Ren H., Liu B., Catasta M., Leskovec J., Open graph benchmark: Datasets for machine learning on graphs, Advances in Neural Information Processing Systems 33, 22118 (2020), arXiv: 2005.00687
213 RDKit: Open-source cheminformatics software, URL: www.rdkit.org, 2022
214 Grattarola D., Spektral, URL: graphneural.network, 2022
215 Li S., Liu Y., Chen D., Jiang Y., Nie Z., Pan F., Encoding the atomic structure for machine learning in materials science, Wiley Interdiscip. Rev. Comput. Mol. Sci. 12(1) (2022)
216 Schmidt J., Marques M. R. G., Botti S., Marques M. A. L.. Recent advances and applications of machine learning in solid-state materials science. npj Comput. Mater., 2019, 5(1): 83
https://doi.org/10.1038/s41524-019-0221-0
217 Rupp M., Tkatchenko A., Müller K. R., von Lilienfeld O. A.. Fast and accurate modeling of molecular atomization energies with machine learning. Phys. Rev. Lett., 2012, 108(5): 058301
https://doi.org/10.1103/PhysRevLett.108.058301
218 Schrier J.. Can one hear the shape of a molecule (from its Coulomb matrix eigenvalues)? J. Chem. Inf. Model., 2020, 60(8): 3804
https://doi.org/10.1021/acs.jcim.0c00631
219 McCarthy M., Lee K. L. K.. Molecule identification with rotational spectroscopy and probabilistic deep learning. J. Phys. Chem. A, 2020, 124(15): 3002
https://doi.org/10.1021/acs.jpca.0c01376
220 Faber F., Lindmaa A., von Lilienfeld O. A., Armiento R.. Crystal structure representations for machine learning models of formation energies. Int. J. Quantum Chem., 2015, 115(16): 1094
https://doi.org/10.1002/qua.24917
221 Schütt K. T., Glawe H., Brockherde F., Sanna A., Müller K. R., Gross E. K. U.. How to represent crystal structures for machine learning: Towards fast prediction of electronic properties. Phys. Rev. B, 2014, 89(20): 205118
https://doi.org/10.1103/PhysRevB.89.205118
222 Behler J., Parrinello M.. Generalized neural-network representation of high-dimensional potential-energy surfaces. Phys. Rev. Lett., 2007, 98(14): 146401
https://doi.org/10.1103/PhysRevLett.98.146401
223 Behler J.. Atom-centered symmetry functions for constructing high-dimensional neural network potentials. J. Chem. Phys., 2011, 134(7): 074106
https://doi.org/10.1063/1.3553717
224 Seko A., Takahashi A., Tanaka I.. Sparse representation for a potential energy surface. Phys. Rev. B, 2014, 90(2): 024101
https://doi.org/10.1103/PhysRevB.90.024101
225 Gastegger M., Schwiedrzik L., Bittermann M., Berzsenyi F., Marquetand P., wACSF — Weighted atom-centered symmetry functions as descriptors in machine learning potentials, J. Chem. Phys. 148(24), 241709 (2018)
226 Bartók A. P., Kondor R., Csányi G.. On representing chemical environments. Phys. Rev. B, 2013, 87(18): 184115
https://doi.org/10.1103/PhysRevB.87.184115
227 Rosenbrock C. W., Homer E. R., Csányi G., Hart G. L. W.. Discovering the building blocks of atomic systems using machine learning: Application to grain boundaries. npj Comput. Mater., 2017, 3: 29
https://doi.org/10.1038/s41524-017-0027-x
228 Paruzzo F. M., Hofstetter A., Musil F., De S., Ceriotti M., Emsley L.. Chemical shifts in molecular solids by machine learning. Nat. Commun., 2018, 9(1): 4501
https://doi.org/10.1038/s41467-018-06972-x
229 Rosen A. S., Iyer S. M., Ray D., Yao Z., Aspuru-Guzik A., Gagliardi L., Notestein J. M., Snurr R. Q.. Machine learning the quantum-chemical properties of metal–organic frameworks for accelerated materials discovery. Matter, 2021, 4(5): 1578
https://doi.org/10.1016/j.matt.2021.02.015
230 Fan Z., Zeng Z., Zhang C., Wang Y., Song K., Dong H., Chen Y., Ala-Nissila T.. Neuroevolution machine learning potentials: Combining high accuracy and low cost in atomistic simulations and application to heat transport. Phys. Rev. B, 2021, 104(10): 104309
https://doi.org/10.1103/PhysRevB.104.104309
231 Mihalić Z., Trinajstić N., A graph-theoretical approach to structure–property relationships, J. Chem. Educ. 69(9), 701 (1992)
232 Isayev O., Oses C., Toher C., Gossett E., Curtarolo S., Tropsha A.. Universal fragment descriptors for predicting properties of inorganic crystals. Nat. Commun., 2017, 8(1): 15679
https://doi.org/10.1038/ncomms15679
233 Xie T., C. Grossman J.. Crystal graph convolutional neural networks for an accurate and interpretable prediction of material properties. Phys. Rev. Lett., 2018, 120(14): 145301
https://doi.org/10.1103/PhysRevLett.120.145301
234 Xia K., Wei G. W.. Persistent homology analysis of protein structure, flexibility and folding. Int. J. Numer. Methods Biomed. Eng., 2014, 30(8): 814
https://doi.org/10.1002/cnm.2655
235 Cang Z., Mu L., Wu K., Opron K., Xia K., Wei G. W., A topological approach for protein classification, Comput. Math. Biophys. 3(1) (2015)
236 Jiang Y., Chen D., Chen X., Li T., Wei G.-W., Pan F.. Topological representations of crystalline compounds for the machine-learning prediction of materials properties. npj Comput. Mater., 2021, 7: 28
https://doi.org/10.1038/s41524-021-00493-w
237 Minamitani E., Shiga T., Kashiwagi M., Obayashi I.. Topological descriptor of thermal conductivity in amorphous Si. J. Chem. Phys., 2022, 156(24): 244502
https://doi.org/10.1063/5.0093441
238 E. Aktas M., Akbas E., E. Fatmaoui A.. Persistence homology of networks: Methods and applications. Appl. Netw. Sci., 2019, 4(1): 1
https://doi.org/10.1007/s41109-019-0179-3
239 Ziletti A., Kumar D., Scheffler M., Ghiringhelli L. M.. Insightful classification of crystal structures using deep learning. Nat. Commun., 2018, 9(1): 2775
https://doi.org/10.1038/s41467-018-05169-6
240 Park W. B., Chung J., Jung J., Sohn K., Singh S. P., Pyo M., Shin N., Sohn K. S.. Classification of crystal structure using a convolutional neural network. IUCrJ, 2017, 4(4): 486
https://doi.org/10.1107/S205225251700714X
241 Zhang Y., He X., Chen Z., Bai Q., Nolan A. M., Roberts C. A., Banerjee D., Matsunaga T., Mo Y., Ling C.. Unsupervised discovery of solid-state lithium ion conductors. Nat. Commun., 2019, 10(1): 5260
https://doi.org/10.1038/s41467-019-13214-1
242 Sieg S. C., Suh C., Schmidt T., Stukowski M., Rajan K., Maier W. F.. Principal component analysis of catalytic functions in the composition space of heterogeneous catalysts. QSAR Comb. Sci., 2007, 26(4): 528
https://doi.org/10.1002/qsar.200620074
243 Tranås R., Løvvik O. M., Tomic O., Berland K.. Lattice thermal conductivity of half-Heuslers with density functional theory and machine learning: Enhancing predictivity by active sampling with principal component analysis. Comput. Mater. Sci., 2022, 202: 110938
https://doi.org/10.1016/j.commatsci.2021.110938
244 Ghiringhelli L. M., Vybiral J., Ahmetcik E., Ouyang R., Levchenko S. V., Draxl C., Scheffler M.. Learning physical descriptors for materials science by compressed sensing. New J. Phys., 2017, 19(2): 023017
https://doi.org/10.1088/1367-2630/aa57bf
245 Ouyang R., Curtarolo S., Ahmetcik E., Scheffler M., Ghiringhelli L. M., SISSO: A compressed-sensing method for identifying the best low-dimensional descriptor in an immensity of offered candidates, Phys. Rev. Mater. 2, 083802 (2018)
246 Lu W. C., Ji X. B., Li M. J., Liu L., Yue B. H., Zhang L. M.. Using support vector machine for materials design. Adv. Manuf., 2013, 1(2): 151
https://doi.org/10.1007/s40436-013-0025-2
247 Wu Y., Prezhdo N., Chu W.. Increasing efficiency of nonadiabatic molecular dynamics by Hamiltonian interpolation with kernel ridge regression. J. Phys. Chem. A, 2021, 125(41): 9191
https://doi.org/10.1021/acs.jpca.1c05105
248 Hastie T., Tibshirani R., Friedman J. H., The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd Ed., in: Springer Series in Statistics, NY: Springer, 2009
249 He K., Zhang X., Ren S., Sun J., Deep residual learning for image recognition, in: 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016, pp 770–778
250 Unke O. T., Chmiela S., Gastegger M., Schütt K. T., Sauceda H. E., Müller K.-R.. SpookyNet: Learning force fields with electronic degrees of freedom and nonlocal effects. Nat. Commun., 2021, 12: 7273
https://doi.org/10.1038/s41467-021-27504-0
251 Zheng C., Chen C., Chen Y., P. Ong S.. Random forest models for accurate identification of coordination environments from X-ray absorption near-edge structure. Patterns, 2020, 1(2): 100013
https://doi.org/10.1016/j.patter.2020.100013
252 J. Kranz J., Kubillus M., Ramakrishnan R., A. von Lilienfeld O., Elstner M.. Generalized density-functional tight-binding repulsive potentials from unsupervised machine learning. J. Chem. Theory Comput., 2018, 14(5): 2341
https://doi.org/10.1021/acs.jctc.7b00933
253 Kim S., Noh J., H. Gu G., Aspuru-Guzik A., Jung Y.. Generative adversarial networks for crystal structure prediction. ACS Cent. Sci., 2020, 6(8): 1412
https://doi.org/10.1021/acscentsci.0c00426
254 Noh J., Kim J., S. Stein H., Sanchez-Lengeling B., M. Gregoire J., Aspuru-Guzik A., Jung Y.. Inverse design of solid-state materials via a continuous representation. Matter, 2019, 1(5): 1370
https://doi.org/10.1016/j.matt.2019.08.017
255 L. Hutchinson M., Antono E., M. Gibbons B., Paradiso S., Ling J., Meredig B., Overcoming data scarcity with transfer learning, arXiv: 1711.05099 (2017)
256 Chang R., Wang Y.-X., Ertekin E.. Towards overcoming data scarcity in materials science: Unifying models and datasets with a mixture of experts framework. npj Comput. Mater., 2022, 8: 242
https://doi.org/10.1038/s41524-022-00929-x
257 A. Nielsen M., Neural Networks and Deep Learning, Determination Press, 2015
258 Akbari A., Ng L., Solnik B.. Drivers of economic and financial integration: A machine learning approach. J. Empir. Finance, 2021, 61: 82
https://doi.org/10.1016/j.jempfin.2020.12.005
259 L. Weng, Flow-based deep generative models, URL: lilianweng.github.io, 2018
260 Raccuglia P., C. Elbert K., D. F. Adler P., Falk C., B. Wenny M., Mollo A., Zeller M., A. Friedler S., Schrier J., J. Norquist A.. Machine-learning-assisted materials discovery using failed experiments. Nature, 2016, 533(7601): 73
https://doi.org/10.1038/nature17439
261 O. Oliynyk A., A. Adutwum L., J. Harynuk J., Mar A.. Classifying crystal structures of binary compounds AB through cluster resolution feature selection and support vector machine analysis. Chem. Mater., 2016, 28(18): 6672
https://doi.org/10.1021/acs.chemmater.6b02905
262 Tang J., Cai Q., Liu Y., Prediction of material mechanical properties with support vector machine, in: 2010 International Conference on Machine Vision and Human-machine Interface, April 2010, pp 592–595
263 C. Elton D., Boukouvalas Z., S. Butrico M., D. Fuge M., W. Chung P.. Applying machine learning techniques to predict the properties of energetic materials. Sci. Rep., 2018, 8(1): 9059
https://doi.org/10.1038/s41598-018-27344-x
264 Hu D., Xie Y., Li X., Li L., Lan Z.. Inclusion of machine learning kernel ridge regression potential energy surfaces in on-the-fly nonadiabatic molecular dynamics simulation. J. Phys. Chem. Lett., 2018, 9(11): 2725
https://doi.org/10.1021/acs.jpclett.8b00684
265 T. Schütt K., Arbabzadah F., Chmiela S., R. Müller K., Tkatchenko A.. Quantum-chemical insights from deep tensor neural networks. Nat. Commun., 2017, 8(1): 13890
https://doi.org/10.1038/ncomms13890
266 Jha D., Ward L., Paul A., Liao W.-K., Choudhary A., Wolverton C., Agrawal A.. Elemnet: Deep learning the chemistry of materials from only elemental composition. Sci. Rep., 2018, 8: 17593
https://doi.org/10.1038/s41598-018-35934-y
267 Jha D., Ward L., Yang Z., Wolverton C., Foster I., K. Liao W., Choudhary A., Agrawal A., IRNet: A general purpose deep residual regression framework for materials discovery, in: Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2019, pp 2385–2393
268 T. Unke O., Meuwly M.. Physnet: A neural network for predicting energies, forces, dipole moments, and partial charges. J. Chem. Theo. Comput., 2019, 15(6): 3678
https://doi.org/10.1021/acs.jctc.9b00181
269 Liu Z., Lin L., Jia Q., Cheng Z., Jiang Y., Guo Y., Ma J.. Transferable multilevel attention neural network for accurate prediction of quantum chemistry properties via multitask learning. J. Chem. Inform. Model., 2021, 61(3): 1066
https://doi.org/10.1021/acs.jcim.0c01224
270 M. Krajewski A., W. Siegel J., Xu J., K. Liu Z.. Extensible structure-informed prediction of formation energy with improved accuracy and usability employing neural networks. Comput. Mater. Sci., 2022, 208: 111254
https://doi.org/10.1016/j.commatsci.2022.111254
271 T. Schütt K., J. Kindermans P., E. Sauceda H., Chmiela S., Tkatchenko A., R. Müller K., SchNet: A continuous-filter convolutional neural network for modeling quantum interactions, in: Proceedings of the 31st International Conference on Neural Information Processing Systems (NIPS’17), Red Hook, NY, USA: Curran Associates Inc., Dec. 2017, pp 992–1002
272 Jung J., et al.. Super-resolving material microstructure image via deep learning for microstructure characterization and mechanical behavior analysis. npj Comput. Mater., 2021, 7: 96
https://doi.org/10.1038/s41524-021-00568-8
273 A. K. Farizhandi A., Betancourt O., Mamivand M.. Deep learning approach for chemistry and processing history prediction from materials microstructure. Sci. Rep., 2022, 12(1): 4552
https://doi.org/10.1038/s41598-022-08484-7
274 Xie T., C. Grossman J.. Crystal graph convolutional neural networks for an accurate and interpretable prediction of material properties. Phys. Rev. Lett., 2018, 120(14): 145301
https://doi.org/10.1103/PhysRevLett.120.145301
275 Chen C., Ye W., Zuo Y., Zheng C., P. Ong S.. Graph networks as a universal machine learning framework for molecules and crystals. Chem. Mater., 2019, 31(9): 3564
https://doi.org/10.1021/acs.chemmater.9b01294
276 Y. Louis S., Zhao Y., Nasiri A., Wang X., Song Y., Liu F., Hu J.. Graph convolutional neural networks with global attention for improved materials property prediction. Phys. Chem. Chem. Phys., 2020, 22(32): 18141
https://doi.org/10.1039/D0CP01474E
277 Qiao Z., Welborn M., Anandkumar A., R. Manby F., F. Miller T.. OrbNet: Deep learning for quantum chemistry using symmetry-adapted atomic-orbital features. J. Chem. Phys., 2020, 153(12): 124111
https://doi.org/10.1063/5.0021955
278 Gasteiger J., Groß J., Günnemann S., Directional message passing for molecular graphs, arXiv: 2003.03123 (2020)
279 Choudhary K., DeCost B.. Atomistic line graph neural network for improved materials property predictions. npj Comput. Mater., 2021, 7(1): 185
https://doi.org/10.1038/s41524-021-00650-1
280 Zhang S., Liu Y., Xie L., Molecular mechanics-driven graph neural network with multiplex graph for molecular structures, arXiv: 2011.07457 (2020)
281 Ghorbani M., Prasad S., B. Klauda J., R. Brooks B.. GraphVAMPNet, using graph neural networks and variational approach to Markov processes for dynamical modeling of biomolecules. J. Chem. Phys., 2022, 156(18): 184103
https://doi.org/10.1063/5.0085607
282 Xie T., France-Lanord A., Wang Y., Shao-Horn Y., C. Grossman J.. Graph dynamical networks for unsupervised learning of atomic scale dynamics in materials. Nat. Commun., 2019, 10(1): 2667
https://doi.org/10.1038/s41467-019-10663-6
283 Batzner S., Musaelian A., Sun L., Geiger M., P. Mailoa J., Kornbluth M., Molinari N., E. Smidt T., Kozinsky B., E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials, Nat. Commun. 13, 2453 (2022)
284 T. Schütt K., T. Unke O., Gastegger M., Equivariant message passing for the prediction of tensorial properties and molecular spectra, in: International Conference on Machine Learning, 2021, pp 9377–9388
285 Jiang Y., Yang Z., Guo J., Li H., Liu Y., Guo Y., Li M., Pu X.. Coupling complementary strategy to flexible graph neural network for quick discovery of coformer in diverse co-crystal materials. Nat. Commun., 2021, 12(1): 5950
https://doi.org/10.1038/s41467-021-26226-7
286 W. Park C., Wolverton C.. Developing an improved crystal graph convolutional neural network framework for accelerated materials discovery. Phys. Rev. Mater., 2020, 4(6): 063801
https://doi.org/10.1103/PhysRevMaterials.4.063801
287 P. Ren G., J. Yin Y., J. Wu K., He Y.. Force field-inspired molecular representation learning for property prediction. J. Cheminform., 2023, 15(1): 17
https://doi.org/10.1186/s13321-023-00691-2
288 Chen C., P. Ong S.. AtomSets as a hierarchical transfer learning framework for small and large materials datasets. npj Comput. Mater., 2021, 7: 173
https://doi.org/10.1038/s41524-021-00639-w
289 Yamada H., Liu C., Wu S., Koyama Y., Ju S., Shiomi J., Morikawa J., Yoshida R.. Predicting materials properties with little data using shotgun transfer learning. ACS Cent. Sci., 2019, 5(10): 1717
https://doi.org/10.1021/acscentsci.9b00804
290 Feng S., Fu H., Zhou H., Wu Y., Lu Z., Dong H., A general and transferable deep learning framework for predicting phase formation in materials, npj Comput. Mater. 7(1), 10 (2021)
291 Gupta V., Choudhary K., Tavazza F., Campbell C., K. Liao W., Choudhary A., Agrawal A.. Cross-property deep transfer learning framework for enhanced predictive analytics on small materials data. Nat. Commun., 2021, 12: 6595
https://doi.org/10.1038/s41467-021-26921-5
292 Stanev V., Oses C., G. Kusne A., Rodriguez E., Paglione J., Curtarolo S., Takeuchi I.. Machine learning modeling of superconducting critical temperature. npj Comput. Mater., 2018, 4(1): 29
https://doi.org/10.1038/s41524-018-0085-8
293 S. Palmer D., M. O’Boyle N., C. Glen R., B. O. Mitchell J.. Random forest models to predict aqueous solubility. J. Chem. Inform. Model., 2007, 47(1): 150
https://doi.org/10.1021/ci060164k
294 Banerjee P., Preissner R.. Bittersweetforest: A random forest based binary classifier to predict bitterness and sweetness of chemical compounds. Front. Chem., 2018, 6: 93
https://doi.org/10.3389/fchem.2018.00093
295 Raccuglia P., C. Elbert K., D. F. Adler P., Falk C., B. Wenny M., Mollo A., Zeller M., A. Friedler S., Schrier J., J. Norquist A.. Machine-learning-assisted materials discovery using failed experiments. Nature, 2016, 533(7601): 73
https://doi.org/10.1038/nature17439
296 Chen L., Xu B., Chen J., Bi K., Li C., Lu S., Hu G., Lin Y.. Ensemble-machine-learning-based correlation analysis of internal and band characteristics of thermoelectric materials. J. Mater. Chem. C, 2020, 8(37): 13079
https://doi.org/10.1039/D0TC02855J
297 Venderley J., Mallayya K., Matty M., Krogstad M., Ruff J., Pleiss G., Kishore V., Mandrus D., Phelan D., Poudel L., G. Wilson A., Weinberger K., Upreti P., Norman M., Rosenkranz S., Osborn R., A. Kim E.. Harnessing interpretable and unsupervised machine learning to address big data from modern X-ray diffraction. Proc. Natl. Acad. Sci. USA, 2022, 119(24): e2109665119
https://doi.org/10.1073/pnas.2109665119
298 Cohn R., Holm E.. Unsupervised machine learning via transfer learning and k-means clustering to classify materials image data. Integr. Mater. Manuf. Innov., 2021, 10(2): 231
https://doi.org/10.1007/s40192-021-00205-8
299 E. A. Goodall R., A. Lee A.. Predicting materials properties without crystal structure: Deep representation learning from stoichiometry. Nat. Commun., 2020, 11(1): 6280
https://doi.org/10.1038/s41467-020-19964-7
300 Muraoka K., Sada Y., Miyazaki D., Chaikittisilp W., Okubo T.. Linking synthesis and structure descriptors from a large collection of synthetic records of zeolite materials. Nat. Commun., 2019, 10(1): 4459
https://doi.org/10.1038/s41467-019-12394-0
301 Jha D., Choudhary K., Tavazza F., Liao W., Choudhary A., Campbell C., Agrawal A.. Enhancing materials property prediction by leveraging computational and experimental data using deep transfer learning. Nat. Commun., 2019, 10(1): 5316
https://doi.org/10.1038/s41467-019-13297-w
302 Zhong X., Gallagher B., Liu S., Kailkhura B., Hiszpanski A., Y.-J. Han T.. Explainable machine learning in materials science. npj Comput. Mater., 2022, 8: 204
https://doi.org/10.1038/s41524-022-00884-7
303 Linardatos P., Papastefanopoulos V., Kotsiantis S.. Explainable AI: A review of machine learning interpretability methods. Entropy (Basel), 2020, 23(1): 18
https://doi.org/10.3390/e23010018
304 J. Murdoch W., Singh C., Kumbier K., Abbasi-Asl R., Yu B.. Definitions, methods, and applications in interpretable machine learning. Proc. Natl. Acad. Sci. USA, 2019, 116(44): 22071
https://doi.org/10.1073/pnas.1900654116
305 Kondo R., Yamakawa S., Masuoka Y., Tajima S., Asahi R.. Microstructure recognition using convolutional neural networks for prediction of ionic conductivity in ceramics. Acta Mater., 2017, 141: 29
https://doi.org/10.1016/j.actamat.2017.09.004
306 Das K., Samanta B., Goyal P., Lee S.-C., Bhattacharjee S., Ganguly N.. CrysXPP: An explainable property predictor for crystalline materials. npj Comput. Mater., 2022, 8: 43
https://doi.org/10.1038/s41524-022-00716-8
307 Y. T. Wang A., K. Kauwe S., J. Murdock R., D. Sparks T.. Compositionally restricted attention-based network for materials property predictions. npj Comput. Mater., 2021, 7(1): 77
https://doi.org/10.1038/s41524-021-00545-1
308 Y. T. Wang A., S. Mahmoud M., Czasny M., Gurlo A.. CrabNet for explainable deep learning in materials science: bridging the gap between academia and industry. Integr. Mater. Manuf. Innov., 2022, 11(1): 41
https://doi.org/10.1007/s40192-021-00247-y
309 Parnami A., Lee M., Learning from few examples: A summary of approaches to few-shot learning, arXiv: 2203.04291 (2023)
310 Wang Y., Yao Q., T. Kwok J., M. Ni L.. Generalizing from a few examples: A survey on few-shot learning. ACM Comput. Surv., 2020, 53(3): 63
https://doi.org/10.1145/3386252
311 Wang Y., Abuduweili A., Yao Q., Dou D., Property-aware relation networks for few-shot molecular property prediction, arXiv: 2107.07994 (2021)
312 Guo Z., et al., Few-shot graph learning for molecular property prediction, in: Proceedings of the Web Conference 2021 (WWW ’21), New York, USA: Association for Computing Machinery, June 2021, pp 2559–2567
313 Kaufmann K., Lane H., Liu X., S. Vecchio K.. Efficient few-shot machine learning for classification of EBSD patterns. Sci. Rep., 2021, 11(1): 8172
https://doi.org/10.1038/s41598-021-87557-5
314 Akers S., et al.. Rapid and flexible segmentation of electron microscopy data using few-shot machine learning. npj Comput. Mater., 2021, 7: 187
https://doi.org/10.1038/s41524-021-00652-z
315 P. Perdew J., Schmidt K.. Jacob’s ladder of density functional approximations for the exchange-correlation energy. AIP Conf. Proc., 2001, 577: 1
https://doi.org/10.1063/1.1390175
316 Dick S., Fernandez-Serra M.. Machine learning accurate exchange and correlation functionals of the electronic density. Nat. Commun., 2020, 11(1): 3509
https://doi.org/10.1038/s41467-020-17265-7
317 Nagai R., Akashi R., Sugino O.. Completing density functional theory by machine learning hidden messages from molecules. npj Comput. Mater., 2020, 6(1): 43
https://doi.org/10.1038/s41524-020-0310-0
318 Kirkpatrick J., McMorrow B., H. P. Turban D., L. Gaunt A., S. Spencer J., G. D. G. Matthews A., Obika A., Thiry L., Fortunato M., Pfau D., R. Castellanos L., Petersen S., W. R. Nelson A., Kohli P., Mori-Sánchez P., Hassabis D., J. Cohen A.. Pushing the frontiers of density functionals by solving the fractional electron problem. Science, 2021, 374(6573): 1385
https://doi.org/10.1126/science.abj6511
319 C. Snyder J., Rupp M., Hansen K., R. Müller K., Burke K.. Finding density functionals with machine learning. Phys. Rev. Lett., 2012, 108(25): 253002
https://doi.org/10.1103/PhysRevLett.108.253002
320 Lei X., J. Medford A.. Design and analysis of machine learning exchange-correlation functionals via rotationally invariant convolutional descriptors. Phys. Rev. Mater., 2019, 3(6): 063801
https://doi.org/10.1103/PhysRevMaterials.3.063801
321 Fan Z., Wang Y., Ying P., Song K., Wang J., Wang Y., Zeng Z., Xu K., Lindgren E., M. Rahm J., J. Gabourie A., Liu J., Dong H., Wu J., Chen Y., Zhong Z., Sun J., Erhart P., Su Y., Ala-Nissila T., GPUMD: A package for constructing accurate machine-learned potentials and performing highly efficient atomistic simulations, J. Chem. Phys. 157(11), 114801 (2022)
322 Wang H., Zhang L., Han J., E W.. DeePMD-kit: A deep learning package for many-body potential energy representation and molecular dynamics. Comput. Phys. Commun., 2018, 228: 178
https://doi.org/10.1016/j.cpc.2018.03.016
323 Zhang Y., Wang H., Chen W., Zeng J., Zhang L., Wang H., E W., DP-GEN: A concurrent learning platform for the generation of reliable deep learning based potential energy models, Comput. Phys. Commun. 253, 107206 (2020)
324 Pattnaik P., Raghunathan S., Kalluri T., Bhimalapuram P., V. Jawahar C., D. Priyakumar U.. Machine learning for accurate force calculations in molecular dynamics simulations. J. Phys. Chem. A, 2020, 124(34): 6954
https://doi.org/10.1021/acs.jpca.0c03926
325 Westermayr J., Marquetand P.. Machine learning and excited-state molecular dynamics. Mach. Learn.: Sci. Technol., 2020, 1(4): 043001
https://doi.org/10.1088/2632-2153/ab9c3e
326 Fan G., McSloy A., Aradi B., Y. Yam C., Frauenheim T.. Obtaining electronic properties of molecules through combining density functional tight binding with machine learning. J. Phys. Chem. Lett., 2022, 13(43): 10132
https://doi.org/10.1021/acs.jpclett.2c02586
327 Ahmad Z., Xie T., Maheshwari C., C. Grossman J., Viswanathan V.. Machine learning enabled computational screening of inorganic solid electrolytes for suppression of dendrite formation in lithium metal anodes. ACS Cent. Sci., 2018, 4(8): 996
https://doi.org/10.1021/acscentsci.8b00229
328 Gong S., Wang S., Zhu T., Chen X., Yang Z., J. Buehler M., Shao-Horn Y., C. Grossman J.. Screening and understanding Li adsorption on two-dimensional metallic materials by learning physics and physics-simplified learning. JACS Au, 2021, 1(11): 1904
https://doi.org/10.1021/jacsau.1c00260
329 Xie T., France-Lanord A., Wang Y., Lopez J., A. Stolberg M., Hill M., M. Leverick G., Gomez-Bombarelli R., A. Johnson J., Shao-Horn Y., C. Grossman J.. Accelerating amorphous polymer electrolyte screening by learning to reduce errors in molecular dynamics simulated properties. Nat. Commun., 2022, 13(1): 3415
https://doi.org/10.1038/s41467-022-30994-1
330 Gubaev K., V. Podryabinkin E., L. Hart G., V. Shapeev A.. Accelerating high-throughput searches for new alloys with active learning of interatomic potentials. Comput. Mater. Sci., 2019, 156: 148
https://doi.org/10.1016/j.commatsci.2018.09.031
331 Xie T., Fu X., E. Ganea O., Barzilay R., Jaakkola T., Crystal diffusion variational autoencoder for periodic material generation, arXiv: 2110.06197 (2021)
332 Dong Y., Li D., Zhang C., Wu C., Wang H., Xin M., Cheng J., Lin J.. Inverse design of two-dimensional graphene/h-BN hybrids by a regressional and conditional GAN. Carbon, 2020, 169: 9
https://doi.org/10.1016/j.carbon.2020.07.013
333 Pathak Y., S. Juneja K., Varma G., Ehara M., D. Priyakumar U.. Deep learning enabled inorganic material generator. Phys. Chem. Chem. Phys., 2020, 22(46): 26935
https://doi.org/10.1039/D0CP03508D
334 Suzuki Y., Hino H., Hawai T., Saito K., Kotsugi M., Ono K.. Symmetry prediction and knowledge discovery from X-ray diffraction patterns using an interpretable machine learning approach. Sci. Rep., 2020, 10(1): 21790
https://doi.org/10.1038/s41598-020-77474-4
335 A. Enders A., M. North N., M. Fensore C., Velez-Alvarez J., C. Allen H.. Functional group identification for FTIR spectra using image-based machine learning models. Anal. Chem., 2021, 93(28): 9711
https://doi.org/10.1021/acs.analchem.1c00867
336 Huang B., Li Z., Li J.. An artificial intelligence atomic force microscope enabled by machine learning. Nanoscale, 2018, 10(45): 21320
https://doi.org/10.1039/C8NR06734A
337 Chandrashekar A., Belardinelli P., A. Bessa M., Staufer U., Alijani F.. Quantifying nanoscale forces using machine learning in dynamic atomic force microscopy. Nanoscale Adv., 2022, 4(9): 2134
https://doi.org/10.1039/D2NA00011C
338 V. Kalinin S., Ophus C., M. Voyles P., Erni R., Kepaptsoglou D., Grillo V., R. Lupini A., P. Oxley M., Schwenker E., K. Y. Chan M., Etheridge J., Li X., G. D. Han G., Ziatdinov M., Shibata N., J. Pennycook S.. Machine learning in scanning transmission electron microscopy. Nat. Rev. Methods Primers, 2022, 2(1): 11
https://doi.org/10.1038/s43586-022-00095-w
339 Jung J., et al.. Super-resolving material microstructure image via deep learning for microstructure characterization and mechanical behavior analysis. npj Comput. Mater., 2021, 7: 96
https://doi.org/10.1038/s41524-021-00568-8
340 Floridi L., Chiriatti M., GPT-3: Its nature, scope, limits, and consequences, Minds Mach. 30(4), 681 (2020)
341 OpenAI, GPT-4 Technical Report, arXiv: 2303.08774 (2023)
342 M. Katz D., J. Bommarito M., Gao S., Arredondo P., GPT-4 passes the bar exam, Rochester, NY, Mar. 15, 2023
343 Tshitoyan V., Dagdelen J., Weston L., Dunn A., Rong Z., Kononova O., A. Persson K., Ceder G., Jain A.. Unsupervised word embeddings capture latent knowledge from materials science literature. Nature, 2019, 571(7763): 95
https://doi.org/10.1038/s41586-019-1335-8
344 A. Olivetti E., M. Cole J., Kim E., Kononova O., Ceder G., Y.-J. Han T., M. Hiszpanski A.. Data-driven materials research enabled by natural language processing and information extraction. Appl. Phys. Rev., 2020, 7(4): 041317
https://doi.org/10.1063/5.0021106
345 Shetty P., Ramprasad R.. Automated knowledge extraction from polymer literature using natural language processing. iScience, 2021, 24(1): 101922
https://doi.org/10.1016/j.isci.2020.101922
346 Davies A., Veličković P., Buesing L., Blackwell S., Zheng D., Tomašev N., Tanburn R., Battaglia P., Blundell C., Juhász A., Lackenby M., Williamson G., Hassabis D., Kohli P.. Advancing mathematics by guiding human intuition with AI. Nature, 2021, 600(7887): 70
https://doi.org/10.1038/s41586-021-04086-x
347 E. Karniadakis G., G. Kevrekidis I., Lu L., Perdikaris P., Wang S., Yang L.. Physics informed machine learning. Nat. Rev. Phys., 2021, 3(6): 422
https://doi.org/10.1038/s42254-021-00314-5
348 Goyal A., Bengio Y., Inductive biases for deep learning of higher-level cognition, Proc. R. Soc. A 478(2266), 20210068 (2022)
349 Baker B., Akkaya I., Zhokhov P., Huizinga J., Tang J., Ecoffet A., Houghton B., Sampedro R., Clune J., Video pretraining (VPT): Learning to act by watching unlabeled online videos, Advances in Neural Information Processing Systems 35, 24639 (2022)
350 Lehman J., Gordon J., Jain S., Ndousse K., Yeh C., O. Stanley K., Evolution through large models, arXiv: 2206.08896 (2022)
351 S. Anis M., et al., Qiskit: An open-source framework for quantum computing, 2021
352 Wu C., Wu F., Lyu L., Huang Y., Xie X.. Communication-efficient federated learning via knowledge distillation. Nat. Commun., 2022, 13: 2032
https://doi.org/10.1038/s41467-022-29763-x
353 G. Yu H.. Neural network iterative diagonalization method to solve eigenvalue problems in quantum mechanics. Phys. Chem. Chem. Phys., 2015, 17(21): 14071
https://doi.org/10.1039/C5CP01438G
354 K. Ghosh S., Ghosh D.. Machine learning matrix product state Ansatz for strongly correlated systems. J. Chem. Phys., 2023, 158(6): 064108
https://doi.org/10.1063/5.0133399
355 C. H. Nguyen P., B. Choi J., S. Udaykumar H., Baek S.. Challenges and opportunities for machine learning in multiscale computational modeling. J. Comput. Inf. Sci. Eng., 2023, 23(6): 060808
https://doi.org/10.1115/1.4062495
356 Wahab H., Jain V., S. Tyrrell A., A. Seas M., Kotthoff L., A. Johnson P.. Machine-learning-assisted fabrication: Bayesian optimization of laser-induced graphene patterning using in-situ Raman analysis. Carbon, 2020, 167: 609
https://doi.org/10.1016/j.carbon.2020.05.087
357 Tayyebi A., S. Alshami A., Yu X., Kolodka E.. Can machine learning methods guide gas separation membranes fabrication. J. Membrane Sci. Lett., 2022, 2(2): 100033
https://doi.org/10.1016/j.memlet.2022.100033
358 T. Chen Y., Duquesnoy M., H. S. Tan D., M. Doux J., Yang H., Deysher G., Ridley P., A. Franco A., S. Meng Y., Chen Z.. Fabrication of high-quality thin solid-state electrolyte films assisted by machine learning. ACS Energy Lett., 2021, 6(4): 1639
https://doi.org/10.1021/acsenergylett.1c00332
359 Li W., Liang L., Zhao S., Zhang S., Xue J.. Fabrication of nanopores in a graphene sheet with heavy ions: A molecular dynamics study. J. Appl. Phys., 2013, 114(23): 234304
https://doi.org/10.1063/1.4837657
360 L. Safina L., A. Baimova J.. Molecular dynamics simulation of fabrication of Ni-graphene composite: Temperature effect. Micro & Nano Lett., 2020, 15(3): 176
https://doi.org/10.1049/mnl.2019.0414
361 Zhao B., Shen C., Yan H., Xie J., Liu X., Dai Y., Zhang J., Zheng J., Wu L., Zhu Y., Jiang Y.. Constructing uniform oxygen defect engineering on primary particle level for high-stability lithium-rich cathode materials. Chem. Eng. J., 2023, 465: 142928
https://doi.org/10.1016/j.cej.2023.142928
362 X. Liao X., Q. Wang H., C. Zheng J.. Tuning the structural, electronic, and magnetic properties of strontium titanate through atomic design: A comparison between oxygen vacancies and nitrogen doping. J. Am. Ceram. Soc., 2013, 96(2): 538
https://doi.org/10.1111/jace.12072
363 Xing H., Q. Wang H., Song T., Li C., Dai Y., Fu G., Kang J., C. Zheng J.. Electronic and thermal properties of Ag-doped single crystal zinc oxide via laser-induced technique. Chin. Phys. B, 2023, 32(6): 066107
https://doi.org/10.1088/1674-1056/acae74
364 Wu L., C. Zheng J., Zhou J., Li Q., Yang J., Zhu Y.. Nanostructures and defects in thermoelectric AgPb18SbTe20 single crystal. J. Appl. Phys., 2009, 105(9): 094317
https://doi.org/10.1063/1.3124364
365 Zeng H., Wu M., Q. Wang H., C. Zheng J., Kang J.. Tuning the magnetic and electronic properties of strontium titanate by carbon doping. Front. Phys., 2021, 16(4): 43501
https://doi.org/10.1007/s11467-020-1034-9
366 Li D., Q. Wang H., Zhou H., P. Li Y., Huang Z., C. Zheng J., O. Wang J., Qian H., Ibrahim K., Chen X., Zhan H., Zhou Y., Kang J.. Influence of nitrogen and magnesium doping on the properties of ZnO films. Chin. Phys. B, 2016, 25(7): 076105
https://doi.org/10.1088/1674-1056/25/7/076105
367 Wang R., C. Zheng J.. Promising transition metal decorated borophene catalyst for water splitting. RSC Advances, 2023, 13(14): 9678
https://doi.org/10.1039/D3RA00299C
368 He J., D. Zhao L., C. Zheng J., W. Doak J., Wu H., Q. Wang H., Lee Y., Wolverton C., G. Kanatzidis M., P. Dravid V.. Role of sodium doping in lead chalcogenide thermoelectrics. J. Am. Chem. Soc., 2013, 135(12): 4624
https://doi.org/10.1021/ja312562d
369 D. Cooley L., J. Zambano A., R. Moodenbaugh A., F. Klie R., C. Zheng J., Zhu Y.. Inversion of two-band superconductivity at the critical electron doping of (Mg,Al)B2. Phys. Rev. Lett., 2005, 95(26): 267002
https://doi.org/10.1103/PhysRevLett.95.267002
370 Yan H., Wang T., Liu L., Song T., Li C., Sun L., Wu L., C. Zheng J., Dai Y.. High voltage stable cycling of all-solid-state lithium metal batteries enabled by top-down direct fluorinated poly(ethylene oxide)-based electrolytes. J. Power Sources, 2023, 557: 232559
https://doi.org/10.1016/j.jpowsour.2022.232559
371 C. Zheng J., H. A. Huan C., T. S. Wee A., Z. Wang R., M. Zheng Y.. Ground-state properties of cubic C-BN solid solutions. J. Phys.: Condens. Matter, 1999, 11(3): 927
https://doi.org/10.1088/0953-8984/11/3/030
372 Huang Z., Y. Lü T., Q. Wang H., W. Yang S., C. Zheng J.. Electronic properties of the group-III nitrides (BN, AlN and GaN) atomic sheets under biaxial strains. Comput. Mater. Sci., 2017, 130: 232
https://doi.org/10.1016/j.commatsci.2017.01.013
373 Y. Lü T., X. Liao X., Q. Wang H., C. Zheng J.. Tuning the indirect–direct band gap transition of SiC, GeC and SnC monolayer in a graphene-like honeycomb structure by strain engineering: A quasiparticle GW study. J. Mater. Chem., 2012, 22(19): 10062
https://doi.org/10.1039/c2jm30915g
374 C. Zheng J., W. Davenport J.. Ferromagnetism and stability of half-metallic MnSb and MnBi in the strained zinc-blende structure: Predictions from full potential and pseudopotential calculations. Phys. Rev. B, 2004, 69(14): 144415
https://doi.org/10.1103/PhysRevB.69.144415
375 Xu L., Q. Wang H., C. Zheng J.. Thermoelectric properties of PbTe, SnTe, and GeTe at high pressure: An ab initio study. J. Electron. Mater., 2011, 40(5): 641
https://doi.org/10.1007/s11664-010-1491-y
376 Xu L., Zheng Y., C. Zheng J.. Thermoelectric transport properties of PbTe under pressure. Phys. Rev. B, 2010, 82(19): 195102
https://doi.org/10.1103/PhysRevB.82.195102
377 C. Zheng J.. Superhard hexagonal transition metal and its carbide and nitride: Os, OsC, and OsN. Phys. Rev. B, 2005, 72(5): 052105
https://doi.org/10.1103/PhysRevB.72.052105
378 Sun T., Umemoto K., Wu Z., C. Zheng J., M. Wentzcovitch R.. Lattice dynamics and thermal equation of state of platinum. Phys. Rev. B, 2008, 78(2): 024304
https://doi.org/10.1103/PhysRevB.78.024304
379 Wu Z., M. Wentzcovitch R., Umemoto K., Li B., Hirose K., C. Zheng J.. Pressure-volume-temperature relations in MgO: An ultrahigh pressure-temperature scale for planetary sciences applications. J. Geophys. Res., 2008, 113(B6): B06204
https://doi.org/10.1029/2007JB005275
380 Deng S., Wu L., Cheng H., C. Zheng J., Cheng S., Li J., Wang W., Shen J., Tao J., Zhu J., Zhu Y.. Charge-lattice coupling in hole-doped LuFe2O4+δ: The origin of second-order modulation. Phys. Rev. Lett., 2019, 122(12): 126401
https://doi.org/10.1103/PhysRevLett.122.126401
381 C. Zheng J., Wu L., Zhu Y., W. Davenport J.. On the sensitivity of electron and X-ray scattering factors to valence charge distribution. J. Appl. Crystall., 2005, 38: 648
https://doi.org/10.1107/S0021889805016109
382 C. Zheng J., Q. Wang H.. Principles and applications of a comprehensive characterization method combining synchrotron radiation technology, transmission electron microscopy, and density functional theory. Sci. Sin. - Phys. Mech. & Astron., 2021, 51(3): 030007
https://doi.org/10.1360/SSPMA-2020-0441