*Bounty: 100*

I’m reading a Neural Networks Tutorial. To answer my question you may need to take a brief look at it.

I understand everything until something they call the "chain function":

It’s located under 4.5 Backpropagation in depth.

I know that the chain rule says the derivative of a composite function equals the product of the derivative of the outer function (evaluated at the inner function) and the derivative of the inner function, but I still don’t understand how they arrived at this equation.
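To make sure my understanding of the chain rule itself is right, here is a minimal numeric sketch (not from the article; `f` and `g` are hypothetical example functions I picked) checking that the derivative of a composite equals the product of the outer and inner derivatives:

```python
import math

def g(x):            # inner function: g(x) = x^2
    return x ** 2

def g_prime(x):      # its derivative: g'(x) = 2x
    return 2 * x

def f(u):            # outer function: f(u) = sin(u)
    return math.sin(u)

def f_prime(u):      # its derivative: f'(u) = cos(u)
    return math.cos(u)

def h(x):            # composite: h(x) = f(g(x)) = sin(x^2)
    return f(g(x))

x = 1.3
eps = 1e-6

# Numerical derivative of the composite via central differences
numeric = (h(x + eps) - h(x - eps)) / (2 * eps)

# Chain rule: outer derivative evaluated at g(x), times inner derivative
analytic = f_prime(g(x)) * g_prime(x)

print(numeric, analytic)
```

The two printed values agree to several decimal places, which is the rule I believe the article is applying; what I can’t follow is the step from this rule to their equation.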

**Update:**

Please read the article and try to use only the terms and symbols declared there. It’s confusing and tough enough as it is, so I really don’t need extra knowledge right now. Just use the same "language" as the article and explain how they arrived at that equation.