Neural Networks for Computational Chemistry: Pitfalls and Recommendations
Published online by Cambridge University Press: 21 February 2013
Abstract
There is a long history of using neural networks for function approximation in computational physics and chemistry. Despite their conceptual simplicity, practitioners may face difficulties when putting them to work. This short guide pinpoints common neural network pitfalls, along with corresponding solutions, for successfully carrying out function approximation tasks in physics, chemistry, and other fields.
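To make the setting concrete, here is a minimal sketch (not code from the paper) of the kind of function approximation task in question: a single-hidden-layer network, trained by backpropagation with plain gradient descent, fitted to a Morse-like one-dimensional potential energy curve. All choices below (NumPy, the tanh activation, the learning rate, the standardization step) are illustrative assumptions rather than the authors' recommendations.

```python
# Minimal illustration (not from the paper): fitting a 1-D toy
# "potential energy curve" with a one-hidden-layer neural network
# trained by backpropagation and plain gradient descent.
import numpy as np

rng = np.random.default_rng(0)

# Toy target: a Morse-like potential V(r) = (1 - exp(-(r - 1)))^2
r = np.linspace(0.5, 3.0, 200).reshape(-1, 1)
V = (1.0 - np.exp(-(r - 1.0))) ** 2

# Standardize inputs and targets (a common practical recommendation)
r_s = (r - r.mean()) / r.std()
V_s = (V - V.mean()) / V.std()

# One hidden layer with tanh activations; illustrative hyperparameters
n_hidden = 20
W1 = rng.normal(0.0, 1.0, (1, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 1.0, (n_hidden, 1))
b2 = np.zeros(1)

lr = 0.05
for step in range(5000):
    # Forward pass
    h = np.tanh(r_s @ W1 + b1)            # (N, n_hidden)
    pred = h @ W2 + b2                    # (N, 1)
    err = pred - V_s

    # Backward pass for the mean squared error loss
    n = len(r_s)
    g_pred = 2.0 * err / n
    gW2 = h.T @ g_pred
    gb2 = g_pred.sum(axis=0)
    g_h = (g_pred @ W2.T) * (1.0 - h ** 2)  # tanh'(x) = 1 - tanh(x)^2
    gW1 = r_s.T @ g_h
    gb1 = g_h.sum(axis=0)

    # Gradient-descent update
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print("final MSE (standardized units):", float((err ** 2).mean()))
```

The standardization of inputs and targets before training reflects widely cited practical advice for gradient-based neural network training; without it, first-order optimization of even a toy model like this one tends to converge noticeably more slowly.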
- Type: Articles
- Information: MRS Online Proceedings Library (OPL), Volume 1523: Symposium QQ – Materials Informatics, 2013, mrsf12-1523-qq03-05
- Copyright: © Materials Research Society 2013