
SAGRAD: A Program for Neural Network Training with Simulated Annealing and the Conjugate Gradient Method

Author(s)

Javier Bernal, Jose Torres-Jimenez

Abstract

SAGRAD, a Fortran 77 program for computing neural networks for classification using batch learning, is discussed. Neural network training in SAGRAD is based on a combination of simulated annealing and Møller's scaled conjugate gradient algorithm, the latter a variation of the traditional conjugate gradient method better suited to the nonquadratic nature of neural networks. Several aspects of the implementation of the training process in SAGRAD are discussed, such as the efficient computation of gradients and of products of vectors with Hessian matrices, both required by Møller's algorithm; the (re)initialization of weights with simulated annealing before each execution of Møller's algorithm; and the use of simulated annealing when Møller's algorithm, after possibly making considerable progress, becomes stuck at a local minimum or flat area of weight space. Outlines of the scaled conjugate gradient algorithm, the simulated annealing procedure, and the training process used in SAGRAD are presented, together with results from running SAGRAD on two examples of training data.
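The hybrid scheme described above, simulated annealing to (re)initialize weights, followed by a gradient-based refinement phase, and a further round of annealing if the gradient phase stalls, can be illustrated with a minimal sketch. This is not the SAGRAD code: the network, dataset, annealing schedule, and plain gradient descent standing in for Møller's scaled conjugate gradient are all simplifying assumptions for illustration.

```python
import math
import random

random.seed(0)

# Toy classification data (XOR), standing in for SAGRAD's training data.
DATA = [((0.0, 0.0), 0.0), ((0.0, 1.0), 1.0),
        ((1.0, 0.0), 1.0), ((1.0, 1.0), 0.0)]

H = 3                    # hidden units in a 2-H-1 network
N_W = 3 * H + H + 1      # (2 inputs + bias) per hidden unit, plus output layer

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(w, x):
    """Single-hidden-layer network with sigmoid activations."""
    hidden = [sigmoid(w[3*j] * x[0] + w[3*j+1] * x[1] + w[3*j+2])
              for j in range(H)]
    off = 3 * H
    return sigmoid(sum(w[off + j] * hidden[j] for j in range(H)) + w[off + H])

def loss(w):
    """Mean squared error over the batch (batch learning, as in SAGRAD)."""
    return sum((forward(w, x) - t) ** 2 for x, t in DATA) / len(DATA)

def anneal(w, temp=1.0, cooling=0.95, steps=200):
    """Metropolis-style simulated annealing over weight space;
    returns the best weight vector seen (never worse than the input)."""
    best, best_e = list(w), loss(w)
    cur, cur_e = list(best), best_e
    for _ in range(steps):
        cand = [wi + random.gauss(0.0, temp) for wi in cur]
        e = loss(cand)
        # Accept improvements always, uphill moves with Boltzmann probability.
        if e < cur_e or random.random() < math.exp((cur_e - e) / max(temp, 1e-12)):
            cur, cur_e = cand, e
            if e < best_e:
                best, best_e = list(cand), e
        temp *= cooling
    return best

def numeric_grad(w, eps=1e-5):
    """Central-difference gradient (SAGRAD computes gradients analytically)."""
    g = []
    for i in range(len(w)):
        wp, wm = list(w), list(w)
        wp[i] += eps
        wm[i] -= eps
        g.append((loss(wp) - loss(wm)) / (2 * eps))
    return g

def gradient_phase(w, lr=0.5, iters=300):
    """Stand-in for Moller's scaled conjugate gradient: plain gradient
    descent, tracking the best iterate seen."""
    best, best_e = list(w), loss(w)
    for _ in range(iters):
        g = numeric_grad(w)
        w = [wi - lr * gi for wi, gi in zip(w, g)]
        e = loss(w)
        if e < best_e:
            best, best_e = list(w), e
    return best

w0 = [random.uniform(-1.0, 1.0) for _ in range(N_W)]
initial_loss = loss(w0)
w1 = anneal(w0)                  # SA initialization of the weights
annealed_loss = loss(w1)
w2 = gradient_phase(w1)          # gradient-based refinement
if loss(w2) > 0.01:              # stuck at a minimum or plateau: re-anneal
    w2 = gradient_phase(anneal(w2, temp=0.5))
final_loss = loss(w2)
```

Because both `anneal` and `gradient_phase` return the best weights they encounter, each phase can only keep or lower the loss, mirroring how SAGRAD alternates annealing and conjugate gradient runs without discarding progress.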
Journal

Journal of Research (NIST JRES), Volume 120

Keywords

Neural Networks, Simulated Annealing, Scaled Conjugate Gradient Method

Citation

Bernal, J. and Torres-Jimenez, J. (2015), SAGRAD: A Program for Neural Network Training with Simulated Annealing and the Conjugate Gradient Method, Journal of Research (NIST JRES), National Institute of Standards and Technology, Gaithersburg, MD, [online], https://doi.org/10.6028/jres.120.009 (Accessed July 15, 2024)

Issues

If you have any questions about this publication or are having problems accessing it, please contact reflib@nist.gov.

Created June 17, 2015, Updated November 10, 2018