*Bounty: 50*

I’m currently looking into different training methods for neural networks. I’ve implemented simple gradient descent with backpropagation in my network (a very simple network with one hidden layer). I’m now struggling to implement a more sophisticated algorithm for training the weights and biases, namely Levenberg–Marquardt. Is there a similarly elegant method that uses backpropagation to calculate the Jacobian?

Edit: I’ve found [1], which is a really interesting article and is almost what I’m looking for. Nevertheless, I don’t understand how to propagate through the different layers, because I only have the error $e$ at the last (output) layer. However, I want to adjust two weight matrices (one from the input to the hidden layer, and one from the hidden layer to the output).
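From my reading of [1], the trick seems to be that the Jacobian of the residuals can be built with the same backward pass as ordinary backpropagation, except the sensitivity is seeded with $1$ (the derivative of the output with respect to itself) instead of the output error, once per training sample. Here is a NumPy sketch of my current understanding; the network sizes, the toy data, and the $\mu$ update rule (divide/multiply by 10) are my own choices for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-hidden-layer network: x -> tanh(W1 x + b1) -> W2 h + b2
n_in, n_hid, n_out = 2, 5, 1
p = np.concatenate([
    rng.normal(scale=0.5, size=n_hid * n_in),  # W1
    np.zeros(n_hid),                           # b1
    rng.normal(scale=0.5, size=n_out * n_hid), # W2
    np.zeros(n_out),                           # b2
])

def unpack(p):
    i = 0
    W1 = p[i:i + n_hid * n_in].reshape(n_hid, n_in); i += n_hid * n_in
    b1 = p[i:i + n_hid]; i += n_hid
    W2 = p[i:i + n_out * n_hid].reshape(n_out, n_hid); i += n_out * n_hid
    b2 = p[i:i + n_out]
    return W1, b1, W2, b2

def forward(p, X):
    W1, b1, W2, b2 = unpack(p)
    A1 = X @ W1.T + b1          # hidden pre-activations
    H = np.tanh(A1)             # hidden activations
    Y = H @ W2.T + b2           # linear output layer
    return A1, H, Y

def jacobian(p, X):
    """Jacobian of the residuals w.r.t. all parameters, one block of
    n_out rows per sample, obtained by backpropagating the sensitivity
    seeded with the identity (dy/dy = 1) instead of the error."""
    W1, b1, W2, b2 = unpack(p)
    A1, H, Y = forward(p, X)
    J = np.zeros((X.shape[0] * n_out, p.size))
    for k in range(X.shape[0]):
        d2 = np.eye(n_out)                           # output layer is linear
        d1 = (d2 @ W2) * (1.0 - np.tanh(A1[k])**2)   # back through W2 and tanh
        J[k * n_out:(k + 1) * n_out] = np.concatenate([
            (d1[:, :, None] * X[k][None, None, :]).reshape(n_out, -1),  # dW1
            d1,                                                         # db1
            (d2[:, :, None] * H[k][None, None, :]).reshape(n_out, -1),  # dW2
            d2,                                                         # db2
        ], axis=1)
    return J

def lm_step(p, X, T, mu):
    """One Levenberg-Marquardt update: p <- p - (J^T J + mu I)^{-1} J^T e."""
    _, _, Y = forward(p, X)
    e = (Y - T).ravel()
    J = jacobian(p, X)
    A = J.T @ J + mu * np.eye(p.size)
    return p - np.linalg.solve(A, J.T @ e), e @ e

# Fit a toy regression target
X = rng.normal(size=(30, n_in))
T = np.sin(X[:, :1]) + 0.5 * X[:, 1:2]
mu, losses = 1e-2, []
for _ in range(20):
    p_new, sse = lm_step(p, X, T, mu)
    losses.append(sse)
    _, _, Y_new = forward(p_new, X)
    if np.sum((Y_new - T)**2) < sse:
        p, mu = p_new, mu / 10   # accept: move toward Gauss-Newton
    else:
        mu *= 10                 # reject: move toward gradient descent
```

With a second weight matrix the backward pass only goes one level deeper: `d2` (output sensitivity) gives the rows for $W_2, b_2$, and `d1 = (d2 @ W2) * tanh'` gives the rows for $W_1, b_1$. Does this match how the algorithm in [1] is meant to be applied per layer?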

[1] M. T. Hagan and M. B. Menhaj, "Training feedforward networks with the Marquardt algorithm," in IEEE Transactions on Neural Networks, vol. 5, no. 6, pp. 989-993, Nov. 1994, doi: 10.1109/72.329697.