Optimization in an Error Backpropagation Neural Network Environment with a Performance Test on a Pattern Classification Problem

    Publication: Working/Discussion Paper, WU Working Paper


    Abstract

    Various techniques for optimizing the multiple-class cross-entropy error function
    to train single-hidden-layer neural network classifiers with softmax output transfer
    functions are investigated on a real-world multispectral pixel-by-pixel classification
    problem that is of fundamental importance in remote sensing. These techniques
    include epoch-based and batch versions of error backpropagation using gradient
    descent, Polak-Ribière (PR) conjugate gradient, and BFGS quasi-Newton updates.
    The method of choice depends upon the nature of the learning task and whether
    one wants to optimize learning for speed or for generalization performance. It was
    found that, comparatively considered, gradient descent error backpropagation
    provided the best and most stable out-of-sample performance across batch and
    epoch-based modes of operation. If the goal is to maximize learning speed and a
    sacrifice in generalization is acceptable, then PR-conjugate gradient error
    backpropagation tends to be superior. If the training set is very large, stochastic
    epoch-based versions of the local optimizers should be chosen, using a larger
    rather than a smaller epoch size to avoid unacceptable instabilities in the
    generalization results. (authors' abstract)
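
    To make the compared training schemes concrete, the following is a minimal
    sketch (not the authors' code): a single-hidden-layer network with softmax
    outputs trained by minimizing the multiple-class cross-entropy error, comparing
    plain batch gradient descent with SciPy's conjugate gradient ("CG", a
    Polak-Ribière variant) and BFGS quasi-Newton optimizers. The synthetic data,
    layer sizes, learning rate, and iteration counts are illustrative assumptions.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    N, D, H, K = 200, 4, 8, 3                    # samples, inputs, hidden units, classes
    X = rng.normal(size=(N, D))                  # synthetic "pixel" features
    T = np.eye(K)[rng.integers(K, size=N)]       # one-hot class targets

    def unpack(w):
        """Split the flat weight vector into the two layers' weights and biases."""
        i = 0
        W1 = w[i:i + D * H].reshape(D, H); i += D * H
        b1 = w[i:i + H];                   i += H
        W2 = w[i:i + H * K].reshape(H, K); i += H * K
        return W1, b1, W2, w[i:]

    def forward(w):
        W1, b1, W2, b2 = unpack(w)
        Z = np.tanh(X @ W1 + b1)                 # hidden layer
        A = Z @ W2 + b2
        A -= A.max(axis=1, keepdims=True)        # numerically stable softmax
        Y = np.exp(A)
        Y /= Y.sum(axis=1, keepdims=True)
        return Z, Y

    def cross_entropy(w):
        """Mean multiple-class cross-entropy error."""
        _, Y = forward(w)
        return -np.sum(T * np.log(Y + 1e-12)) / N

    def grad(w):
        """Exact gradient of the error, computed by backpropagation."""
        W1, b1, W2, b2 = unpack(w)
        Z, Y = forward(w)
        dA = (Y - T) / N                         # softmax + cross-entropy delta
        dZ = (dA @ W2.T) * (1.0 - Z ** 2)        # backprop through tanh
        return np.concatenate([(X.T @ dZ).ravel(), dZ.sum(0),
                               (Z.T @ dA).ravel(), dA.sum(0)])

    w0 = rng.normal(scale=0.1, size=D * H + H + H * K + K)

    # Batch gradient descent with a fixed (assumed) learning rate.
    w = w0.copy()
    for _ in range(500):
        w -= 0.1 * grad(w)
    print("gradient descent:", cross_entropy(w))

    # SciPy's 'CG' (a Polak-Ribiere variant) and 'BFGS' stand in for the
    # conjugate-gradient and quasi-Newton training schemes the paper compares.
    for method in ("CG", "BFGS"):
        res = minimize(cross_entropy, w0, jac=grad, method=method,
                       options={"maxiter": 500})
        print(f"{method:16s}:", res.fun)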
    Original language: English
    Place of publication: Vienna
    Publisher: WU Vienna University of Economics and Business
    DOIs
    Publication status: Published - 1 March 1998

    Publication series

    Series: Discussion Papers of the Institute for Economic Geography and GIScience
    Number: 62/98

    WU Working Paper Series

    • Discussion Papers of the Institute for Economic Geography and GIScience
