
Library: nn

addr creates a one-hidden-layer network
ann is a tool to run a feed-forward neural network
committee This macro computes a committee of networks, where each net is a single-hidden-layer feed-forward perceptron. The macro can be used alone or in connection with the library ISTA; the standalone version also needs the parameter data (just choose 0 for the input). The number of nets building the committee can be chosen; the data are split into this many parts to build the different datasets. The case weights for the training of the nets, the number of hidden units and additional information concerning the weights of the units can be specified. Different optional parameters can be chosen to set the architecture of the network; the choice holds for every single net, and the default values are chosen in order to solve a linear regression problem. The optional parameter consists of 8 values: Boolean values for linear output, entropy error function, log probability models and skip connections (direct links); the fifth value is the maximum value for the starting weights, the sixth the weight decay, the seventh the maximum number of iterations, and the last value generates output describing the architecture of the net if it is equal to one. The output consists of the error and MSE of the single nets and over all cases; additionally, the R^2 for the average of the nets and the R^2 of the committee are shown. (A conceptual sketch of a committee follows the function list below.)

cv runs a cross validation over the hidden units
cvdec runs a cross validation over the weight decay
erfkl Kullback-Leibler criterion for classification
erfqua (1-R^2) criterion for regression
finalshow shows the final visualization of the network
gennet interactively generates a feedforward network
neuronal This macro computes different networks of the form single-hidden-layer feed-forward perceptron. The macro can be used alone or in connection with the library ISTA; the standalone version also needs the parameter data (just choose 0 for the input). It is possible to split the data into a training and a test set. The case weights for the training of the nets, the numbers of hidden units as ``from, stepwidth, to'' and additional information concerning the weights of the units can be specified. Different optional parameters can be chosen to set the architecture of the network; the choice holds for every single net, and the default values are chosen in order to solve a linear regression problem. The optional parameter consists of 8 values: Boolean values for linear output, entropy error function, log probability models and skip connections (direct links); the fifth value is the maximum value for the starting weights, the sixth the weight decay, the seventh the maximum number of iterations, and the last value generates output describing the architecture of the net if it is equal to one. The output consists of the error and MSE of the different nets (MSE for test and training data separately, if chosen) and the R^2. (A sketch of the hidden-unit sweep follows the function list below.)
nninfo shows some information about the current network
nnlayer builds a feedforward network
nnmain loads the necessary libraries
nnrinfo shows information about the net in the output window.
nnrload loads a network from different files
nnrnet trains a one-hidden-layer feed-forward network. The optional parameter param consists of 8 values: Boolean values for linear output, entropy error function, log probability models and skip connections; the fifth value is the maximum value for the starting weights, the sixth the weight decay, the seventh the maximum number of iterations, and the last value generates some output if equal to one. (A conceptual training sketch follows the function list below.)
nnrpredict estimates the response for a given net and dataset.
nnrsave saves a network to different files
optdec runs a neural network for each set of observations to estimate the generalization error
readshow shows the visualization of a feedforward neural network
resclass shows the residuals in the case of classification
resreg shows the residuals in the case of regression
runcv runs a cross validation and estimates the generalization error
runinit initializes the training and test datasets, the errors and the weights in the network
runnet runs a network with a prespecified optimization method
runnew optimizes a neural network by a quadratic approximation
runqsa optimizes a neural network by a stochastic search
runsa optimizes a neural network by Boltzmann annealing
runshow visualizes a neural network during optimization
weidist1 transforms weights into distances (\delta^{(2)})
weidist2 transforms weights into distances (\delta^{(2)})
weidist3 transforms weights into distances
weinit initializes the weights of a neural network
x3matrix constructs a matrix as in XploRe 3
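
The following sketches illustrate, in Python, some of the ideas described above. They are conceptual stand-ins, not the XploRe macros or their calling conventions; MLPRegressor, the toy data and all parameter values are illustrative assumptions.

A minimal sketch of a committee as described for the committee macro: the data are split into as many parts as there are nets, one single-hidden-layer perceptron is fitted per part, and the committee prediction is the average of the individual predictions, for which R^2 is reported.

    # Conceptual committee sketch (not the XploRe committee macro).
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    x = rng.uniform(-3, 3, (300, 1))                     # toy regression data
    y = np.sin(x).ravel() + 0.1 * rng.standard_normal(300)

    nets = 3                                             # number of committee members
    members = []
    for idx in np.array_split(rng.permutation(len(x)), nets):
        m = MLPRegressor(hidden_layer_sizes=(5,),        # hidden units
                         alpha=1e-4,                     # weight decay
                         max_iter=2000, random_state=0)
        members.append(m.fit(x[idx], y[idx]))            # one net per data part

    committee = np.mean([m.predict(x) for m in members], axis=0)
    ss_res = np.sum((y - committee) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    print("committee R^2:", 1 - ss_res / ss_tot)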
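
A minimal sketch of the hidden-unit sweep described for the neuronal macro: the number of hidden units runs over ``from, stepwidth, to'' and the MSE is reported separately for a training and a test set. The split and all settings are illustrative assumptions.

    # Conceptual hidden-unit sweep (not the XploRe neuronal macro).
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    x = rng.uniform(-3, 3, (300, 1))
    y = np.sin(x).ravel() + 0.1 * rng.standard_normal(300)
    train, test = np.split(rng.permutation(len(x)), [200])   # training/test split

    frm, stepwidth, to = 2, 2, 10                             # "from, stepwidth, to"
    for h in range(frm, to + 1, stepwidth):
        m = MLPRegressor(hidden_layer_sizes=(h,), alpha=1e-4,
                         max_iter=2000, random_state=0)
        m.fit(x[train], y[train])
        mse_tr = np.mean((m.predict(x[train]) - y[train]) ** 2)
        mse_te = np.mean((m.predict(x[test]) - y[test]) ** 2)
        print(f"hidden units {h:2d}: train MSE {mse_tr:.4f}, test MSE {mse_te:.4f}")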
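
A minimal sketch of training a one-hidden-layer feed-forward regression net as described for nnrnet, showing the role of the later entries of the 8-value parameter vector (maximum starting weight, weight decay, maximum number of iterations) with a linear output unit; the entropy error function, log probability models and skip connections are left out. The function names and defaults are assumptions, not the nnrnet interface.

    # Conceptual one-hidden-layer training sketch (not the XploRe nnrnet function).
    import numpy as np

    def train_net(x, y, hidden=5, max_weight=0.7, decay=1e-4,
                  max_iter=3000, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        n, p = x.shape
        w1 = rng.uniform(-max_weight, max_weight, (p, hidden))   # input -> hidden
        b1 = np.zeros(hidden)
        w2 = rng.uniform(-max_weight, max_weight, (hidden, 1))   # hidden -> output
        b2 = np.zeros(1)
        for _ in range(max_iter):                                # gradient descent
            h = np.tanh(x @ w1 + b1)                             # hidden activations
            yhat = h @ w2 + b2                                    # linear output
            err = yhat - y
            g_w2 = h.T @ err / n + decay * w2                     # decay = weight decay
            g_b2 = err.mean(0)
            g_h = (err @ w2.T) * (1 - h ** 2)
            g_w1 = x.T @ g_h / n + decay * w1
            g_b1 = g_h.mean(0)
            w1 -= lr * g_w1; b1 -= lr * g_b1
            w2 -= lr * g_w2; b2 -= lr * g_b2
        return w1, b1, w2, b2

    def predict(net, x):
        w1, b1, w2, b2 = net
        return np.tanh(x @ w1 + b1) @ w2 + b2

    rng = np.random.default_rng(1)
    x = rng.uniform(-3, 3, (200, 1))
    y = np.sin(x) + 0.1 * rng.standard_normal((200, 1))
    net = train_net(x, y)
    print("training MSE:", np.mean((predict(net, x) - y) ** 2))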


(C) MD*TECH Method and Data Technologies, 28.6.1999