A Neural Network Classifier for Spectral Pattern Recognition. On-Line versus Off-Line Backpropagation Training

    Publication: Working/Discussion Paper, WU Working Paper


    Abstract

    In this contribution we evaluate on-line and off-line techniques to train a single
    hidden layer neural network classifier with logistic hidden and softmax output transfer
    functions on a multispectral pixel-by-pixel classification problem. In contrast to
    current practice, a multiple-class cross-entropy error function has been chosen as the
    function to be minimized. The non-linear differential equations cannot be solved in
    closed form. To solve for a set of locally minimizing parameters we use the gradient
    descent technique for parameter updating, based upon the backpropagation technique
    for evaluating the partial derivatives of the error function with respect to the
    parameter weights. Empirical evidence shows that on-line and epoch-based gradient
    descent backpropagation fail to converge within 100,000 iterations, due to the fixed
    step size. Batch gradient descent backpropagation training is superior in terms of
    learning speed and convergence behaviour. Stochastic epoch-based training tends to
    be slightly more effective than on-line and batch training in terms of generalization
    performance, especially when the number of training examples is larger. Moreover, it
    is less prone to fall into local minima than on-line and batch modes of operation. (authors' abstract)
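    The paper itself contains no code; the following is a minimal sketch, assuming NumPy, of the kind of model and training procedure the abstract describes: a single hidden layer with logistic units, a softmax output layer, a multiple-class cross-entropy error, and fixed-step gradient descent driven by backpropagation, with a mode switch contrasting batch (full-gradient) and on-line (per-pattern) updating. All function and parameter names (forward, gradients, train, eta, n_hidden) are illustrative and not taken from the paper; the shuffled per-pattern loop only approximates the stochastic epoch-based variant the authors evaluate.

    # Illustrative sketch, not the authors' implementation.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def softmax(z):
        z = z - z.max(axis=-1, keepdims=True)          # numerical stability
        e = np.exp(z)
        return e / e.sum(axis=-1, keepdims=True)

    def forward(X, W1, b1, W2, b2):
        H = sigmoid(X @ W1 + b1)                       # logistic hidden layer
        P = softmax(H @ W2 + b2)                       # softmax output layer
        return H, P

    def cross_entropy(P, Y):
        # multiple-class cross-entropy error; Y is one-hot (n_samples x n_classes)
        return -np.mean(np.sum(Y * np.log(P + 1e-12), axis=1))

    def gradients(X, Y, W1, b1, W2, b2):
        # backpropagation of the cross-entropy error through softmax and logistic layers
        H, P = forward(X, W1, b1, W2, b2)
        n = X.shape[0]
        delta_out = (P - Y) / n                        # softmax + cross-entropy gradient
        dW2 = H.T @ delta_out
        db2 = delta_out.sum(axis=0)
        delta_hid = (delta_out @ W2.T) * H * (1.0 - H) # derivative of the logistic function
        dW1 = X.T @ delta_hid
        db1 = delta_hid.sum(axis=0)
        return dW1, db1, dW2, db2

    def train(X, Y, n_hidden=10, eta=0.1, epochs=1000, mode="batch", seed=0):
        rng = np.random.default_rng(seed)
        n_in, n_out = X.shape[1], Y.shape[1]
        W1 = rng.normal(scale=0.1, size=(n_in, n_hidden)); b1 = np.zeros(n_hidden)
        W2 = rng.normal(scale=0.1, size=(n_hidden, n_out)); b2 = np.zeros(n_out)
        for _ in range(epochs):
            if mode == "batch":                        # one update per epoch on the full gradient
                dW1, db1_, dW2, db2_ = gradients(X, Y, W1, b1, W2, b2)
                W1 -= eta * dW1; b1 -= eta * db1_
                W2 -= eta * dW2; b2 -= eta * db2_
            else:                                      # on-line: one update per training pattern
                for i in rng.permutation(X.shape[0]):
                    dW1, db1_, dW2, db2_ = gradients(X[i:i+1], Y[i:i+1], W1, b1, W2, b2)
                    W1 -= eta * dW1; b1 -= eta * db1_
                    W2 -= eta * dW2; b2 -= eta * db2_
        return W1, b1, W2, b2

    The fixed step size eta is kept deliberately, since the abstract attributes the convergence failures of the on-line and epoch-based runs to exactly that choice.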
    Original language: English
    Place of publication: Vienna
    Publisher: WU Vienna University of Economics and Business
    Publication status: Published - 1 Dec. 1997

    Publication series

    Series: Discussion Papers of the Institute for Economic Geography and GIScience
    Number: 60/97

    WU Working Paper Series

    • Discussion Papers of the Institute for Economic Geography and GIScience
