1 Ebert T, Bänfer O, Nelles O. Multilayer perceptron network with modified sigmoid activation functions [G]. LNCS 6319: Artificial Intelligence and Computational Intelligence. Berlin: Springer Berlin Heidelberg, 2010: 414-421.
2 Karlik B, Olgac AV. Performance analysis of various activation functions in generalized MLP architectures of neural networks [J]. International Journal of Artificial Intelligence and Expert Systems, 2010, 1 (4): 111-122.
3 Glorot X, Bordes A, Bengio Y. Domain adaptation for large-scale sentiment classification: A deep learning approach [C] //Proceedings of the 28th International Conference on Machine Learning, 2011.
4 Glorot X, Bengio Y. Understanding the difficulty of training deep feedforward neural networks [C] //Proceedings of the International Conference on Artificial Intelligence and Statistics, 2010.
5 Graves A, Mohamed AR, Hinton G. Speech recognition with deep recurrent neural networks [C] //IEEE International Conference on Acoustics, Speech and Signal Processing, 2013.
6 Pradhan B, Lee S. Regional landslide susceptibility analysis using back-propagation neural network model at Cameron Highland, Malaysia [J]. Landslides, 2010, 7 (1): 13-30.
7 Vincent P, Larochelle H, Lajoie I, et al. Stacked denoising autoencoders: Learning useful representations in a deep network with a local denoising criterion [J]. The Journal of Machine Learning Research, 2010, 11: 3371-3408.
8 Bouchard G. Efficient bounds for the softmax function and applications to approximate inference in hybrid models [C] //Proceedings of the NIPS Workshop for Approximate Bayesian Inference in Continuous/Hybrid Systems, 2007.
9 Deng L. The MNIST database of handwritten digit images for machine learning research [J]. IEEE Signal Processing Magazine, 2012, 29 (6): 141-142.