IEEE-INNS-ENNS International Joint Conference on Neural Networks

Abstract

In this paper, we introduce an advanced optimization algorithm for training feedforward neural networks. The algorithm combines the BFGS Hessian update formula with a special case of trust region techniques, the dogleg method, as an alternative to line search methods. Simulations on classification and function approximation problems are presented and show a clear improvement in both convergence and success rates over standard BFGS implementations.
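The abstract does not give implementation details, but the combination it describes can be illustrated with a minimal sketch: a dogleg step computed from a BFGS Hessian approximation, which replaces the line search of a standard BFGS optimizer. The function names, the NumPy-based formulation, and the specific update formulas below are assumptions for illustration, not the authors' code.

```python
# Illustrative sketch (not the paper's implementation) of a dogleg trust-region
# step built on a BFGS Hessian approximation B. Names and structure are assumed.
import numpy as np

def dogleg_step(g, B, delta):
    """Dogleg step for gradient g, Hessian approximation B, trust radius delta."""
    # Full quasi-Newton step: if it lies inside the trust region, take it.
    p_newton = -np.linalg.solve(B, g)
    if np.linalg.norm(p_newton) <= delta:
        return p_newton

    # Cauchy point: minimizer of the quadratic model along -g.
    p_cauchy = -(g @ g) / (g @ B @ g) * g
    norm_c = np.linalg.norm(p_cauchy)
    if norm_c >= delta:
        # Even the Cauchy step leaves the region: scale steepest descent to the boundary.
        return -(delta / np.linalg.norm(g)) * g

    # Dogleg path: move from the Cauchy point toward the Newton point and stop
    # where the path crosses the trust-region boundary (positive root of a quadratic).
    d = p_newton - p_cauchy
    a = d @ d
    b = 2.0 * (p_cauchy @ d)
    c = norm_c**2 - delta**2
    tau = (-b + np.sqrt(b**2 - 4.0 * a * c)) / (2.0 * a)
    return p_cauchy + tau * d

def bfgs_update(B, s, y):
    """Standard BFGS update of B with step s = w_new - w and y = g_new - g."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
```

In a trust-region scheme of this kind, the radius `delta` would typically be grown or shrunk based on the ratio of the actual loss reduction to the reduction predicted by the quadratic model, and the BFGS update applied only when the curvature condition `y @ s > 0` holds.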