Optimization in an Error Backpropagation Neural Network Environment with a Performance Test on a Pattern Classification Problem

    Publication: Working/Discussion Paper · WU Working Paper


    Abstract

    Various techniques for optimizing the multiple-class cross-entropy error function
    to train single-hidden-layer neural network classifiers with softmax output transfer
    functions are investigated on a real-world multispectral pixel-by-pixel classification
    problem of fundamental importance in remote sensing. These techniques include
    epoch-based and batch versions of error backpropagation with gradient descent,
    Polak-Ribière (PR) conjugate gradient, and BFGS quasi-Newton updates. The method
    of choice depends upon the nature of the learning task and whether one wants to
    optimize learning for speed or for generalization performance. It was found that,
    comparatively considered, gradient descent error backpropagation provided the best
    and most stable out-of-sample performance across both batch and epoch-based modes
    of operation. If the goal is to maximize learning speed and a sacrifice in
    generalization is acceptable, then PR-conjugate gradient error backpropagation
    tends to be superior. If the training set is very large, stochastic epoch-based
    versions of local optimizers should be chosen, with a larger rather than a smaller
    epoch size, to avoid unacceptable instabilities in the generalization results.
    (authors' abstract)
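
    The training setup the abstract describes — a single-hidden-layer network with
    softmax outputs, trained by batch gradient descent on the multiple-class
    cross-entropy error — can be sketched as follows. This is not the authors'
    code; the network sizes, learning rate, and synthetic stand-in data are
    illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for multispectral pixel data:
# n pixels, d spectral bands, h hidden units, k land-cover classes (assumed sizes).
n, d, h, k = 300, 6, 10, 4
X = rng.normal(size=(n, d))
y = (X @ rng.normal(size=(d, k))).argmax(axis=1)  # learnable synthetic labels
Y = np.eye(k)[y]                                  # one-hot targets

# Parameters: tanh hidden layer, softmax output layer
W1 = rng.normal(scale=0.1, size=(d, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=0.1, size=(h, k)); b2 = np.zeros(k)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

lr = 0.5
losses = []
for step in range(200):
    # Forward pass
    H = np.tanh(X @ W1 + b1)
    P = softmax(H @ W2 + b2)
    # Multiple-class cross-entropy error
    losses.append(-np.mean(np.sum(Y * np.log(P + 1e-12), axis=1)))
    # Backpropagation, batch mode: gradients accumulated over the full training set
    dZ2 = (P - Y) / n
    dW2 = H.T @ dZ2; db2 = dZ2.sum(axis=0)
    dH = (dZ2 @ W2.T) * (1 - H**2)     # tanh derivative
    dW1 = X.T @ dH;  db1 = dH.sum(axis=0)
    # Gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

    An epoch-based (stochastic) variant, as compared in the paper, would apply the
    same update after each subset of the training pixels rather than after the full
    batch; the conjugate gradient and quasi-Newton alternatives replace the plain
    gradient step with a search direction built from previous gradients.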
    Original language: English
    Place of publication: Vienna
    Publisher: WU Vienna University of Economics and Business
    Publication status: Published - 1 Mar 1998

    Publication series

    Series: Discussion Papers of the Institute for Economic Geography and GIScience
    Number: 62/98

    WU Working Paper Series

    • Discussion Papers of the Institute for Economic Geography and GIScience
