how are important weights defined in a neural network?


So I have the weights of a pretrained neural network, but I'm a bit lost as to what each of the numbers means. Across all the neurons at every layer of a network, what do negative weights and positive weights mean? Does a weight far from 0 mean that it's very important?


First of all, are you sure that you need to understand those numbers? Large CNNs and RNNs may have millions of parameters.

The answer:

  1. The sign of a weight means almost nothing on its own; it's just a coefficient in an equation.
  2. The absolute value of a weight (its distance from zero), however, means a lot. Large-magnitude weights produce strong outputs (which may be a sign of overfitting).
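If you do want a quick feel for your weights, looking at their magnitude distribution per layer is more informative than reading individual numbers. Here is a minimal sketch, assuming your pretrained weights can be loaded as NumPy arrays (random values stand in for a real layer here):

```python
import numpy as np

# Hypothetical example: inspect weight magnitudes for one dense layer.
# In practice, replace this random matrix with your pretrained weights.
rng = np.random.default_rng(0)
weights = rng.normal(loc=0.0, scale=0.1, size=(64, 128))

abs_w = np.abs(weights)
print("mean |w|:", abs_w.mean())
print("max  |w|:", abs_w.max())

# Weights very close to zero contribute little to the layer's output;
# a large fraction of them suggests the layer could be pruned.
near_zero_fraction = (abs_w < 0.01).mean()
print("fraction with |w| < 0.01:", near_zero_fraction)
```

Summaries like these (mean, max, fraction near zero) are what pruning and regularization techniques act on, rather than the sign or exact value of any single weight.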