Thursday, November 17, 2016

Designing and Training Feedforward Neural Networks: A Smooth Optimisation Perspective. (arXiv:1611.05827v1 [cs.LG])

Despite the recent great success of deep neural networks in various applications, designing and training a deep neural network remains among the greatest challenges in the field. In this work, we present a smooth optimisation perspective on designing and training multilayer Feedforward Neural Networks (FNNs) in the supervised learning setting. By characterising the critical-point conditions of an FNN-based optimisation problem, we identify the conditions under which local optima of the corresponding cost function are eliminated. Moreover, by studying the Hessian structure of the cost function at the global minima, we develop an approximate Newton algorithm for FNNs that is capable of alleviating the vanishing gradient problem. Finally, our results are numerically verified on two classic benchmarks: the XOR problem and the four-region classification problem.
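The abstract does not spell out the approximate Newton update itself, so the following is only a minimal sketch of the general idea: curvature-aware training of a small FNN on the XOR benchmark, here using a damped Gauss-Newton (Levenberg-Marquardt) step rather than the paper's own method. The 2-4-1 architecture, the activations, the damping constant lam, and the finite-difference Jacobian are all assumptions made for this illustration.

import numpy as np

# Sketch only: a damped Gauss-Newton (Levenberg-Marquardt) step on a
# tiny feedforward network for XOR. This is NOT the paper's algorithm;
# it just illustrates using curvature information instead of plain
# gradient descent. Architecture and hyperparameters are arbitrary.

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

# 2-4-1 network; all parameters packed into one flat vector.
n_hidden = 4
sizes = [(n_hidden, 2), (n_hidden,), (1, n_hidden), (1,)]
theta = rng.normal(scale=0.5, size=sum(int(np.prod(s)) for s in sizes))

def unpack(theta):
    parts, i = [], 0
    for s in sizes:
        n = int(np.prod(s))
        parts.append(theta[i:i + n].reshape(s))
        i += n
    return parts  # W1, b1, W2, b2

def forward(theta, X):
    W1, b1, W2, b2 = unpack(theta)
    h = np.tanh(X @ W1.T + b1)          # hidden layer
    z = (h @ W2.T + b2).ravel()         # output pre-activation
    return 1.0 / (1.0 + np.exp(-z))     # sigmoid output

def residuals(theta):
    return forward(theta, X) - y

def jacobian(theta, eps=1e-6):
    # Finite-difference Jacobian of the residuals; fine at this scale.
    r0 = residuals(theta)
    J = np.zeros((len(r0), len(theta)))
    for j in range(len(theta)):
        t = theta.copy()
        t[j] += eps
        J[:, j] = (residuals(t) - r0) / eps
    return J

lam = 1e-2  # damping keeps J^T J invertible (only 4 residuals, 17 params)
for step in range(300):
    r = residuals(theta)
    J = jacobian(theta)
    # Damped Gauss-Newton step: solve (J^T J + lam I) d = -J^T r
    d = np.linalg.solve(J.T @ J + lam * np.eye(len(theta)), -J.T @ r)
    theta += d
    if step % 50 == 0:
        print(f"step {step:3d}  loss {0.5 * (r @ r):.6f}")

print("predictions:", np.round(forward(theta, X), 3))

Because the curvature matrix here is heavily rank-deficient (more parameters than residuals), the damping term is essential; it plays the same stabilising role that any practical approximate Newton scheme needs near flat regions of the cost surface.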



from cs.AI updates on arXiv.org http://ift.tt/2g0KznG
via IFTTT
