#StackBounty: #neural-networks #conv-neural-network How to add appropriate noise to a neural network with constant weights so that back…

Bounty: 50

I have a neural network in a synthetic experiment where scale matters, so I do not wish to normalize it away, and where the initial network is initialized with a prior that is non-zero and equal everywhere.

How do I add noise appropriately so that the network trains well under the gradient descent update rule?

$$W^{(t+1)} := W^{(t)} - \eta \nabla_W L(W^{(t)})$$
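One common way to break the symmetry of an all-equal initialization (a sketch of an assumed approach, not necessarily what any answer proposes; the function name and `rel_scale` parameter are hypothetical) is to add small zero-mean Gaussian noise whose standard deviation is a fraction of the constant prior, so the intended scale is preserved on average while units become distinguishable to gradient descent:

```python
import numpy as np

def noisy_constant_init(shape, prior=0.5, rel_scale=0.01, seed=0):
    """Initialize weights at a constant prior, then add small zero-mean
    Gaussian noise to break the symmetry between units.

    The noise std is a small fraction (rel_scale) of the prior, so the
    overall scale of the initialization is approximately preserved.
    """
    rng = np.random.default_rng(seed)
    W = np.full(shape, float(prior))
    W += rng.normal(loc=0.0, scale=rel_scale * abs(prior), size=shape)
    return W

W = noisy_constant_init((4, 3), prior=0.5, rel_scale=0.01)
print(W.mean())  # close to 0.5
print(W.std())   # small, on the order of 0.005
```

Because every entry now differs slightly, rows of the hidden layer no longer receive identical gradients, which is the usual failure mode of a perfectly constant initialization.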

cross-posted:

