Introduction
Trains a two-layer neural network with the Levenberg-Marquardt method. If desired, regularization by weight decay can be used, and pruned (i.e., not fully connected) networks can also be trained. Given a set of corresponding input-output pairs and an initial network,

[W1, W2, critvec, iteration, lambda] = marq(NetDef, W1, W2, PHI, Y, trparms)

trains the network with the Levenberg-Marquardt method. The activation functions can be either linear or tanh. The network architecture is defined by the matrix NetDef, which has two rows: the first row specifies the hidden layer and the second row specifies the output layer.
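
For illustration, a minimal sketch of a call to marq is given below. It assumes the toolbox function marq is on the MATLAB path; the layout of trparms shown here ([max_iter stop_crit lambda weight_decay]) is an assumption and differs between toolbox versions, so check the toolbox help before using it.

    % Training data: inputs PHI (one example per column) and targets Y.
    PHI = linspace(-pi, pi, 100);          % 1 x 100 input matrix
    Y   = sin(PHI) + 0.05*randn(1, 100);   % 1 x 100 noisy target outputs

    % Architecture: row 1 = hidden layer, row 2 = output layer.
    % 'H' = tanh unit, 'L' = linear unit, '-' = no unit in that position.
    NetDef = ['HHHHH'; 'L----'];           % 5 tanh hidden units, 1 linear output

    % Random initial weights; each layer has an extra column for the bias.
    W1 = 0.5*randn(5, size(PHI,1)+1);      % hidden weights: 5 x (inputs+1)
    W2 = 0.5*randn(size(Y,1), 5+1);        % output weights: outputs x (hidden+1)

    % Assumed training-parameter vector: [max_iter stop_crit lambda weight_decay]
    trparms = [200 1e-4 1 0];

    % Train with Levenberg-Marquardt; critvec holds the criterion per iteration.
    [W1, W2, critvec, iteration, lambda] = marq(NetDef, W1, W2, PHI, Y, trparms);

Setting the last entry of trparms to a positive value would (under the assumed layout) add weight-decay regularization, and zeroing selected entries of W1 or W2 corresponds to training a pruned network.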