Comparing two neural networks (nntool in Matlab)

I'm new to the Neural Network Toolbox (nntool) in Matlab. I have trained two networks on the same data set. One of these networks has more neurons than the other.

Now I'm wondering: how can I compare these networks? How can I say network A is better than network B?

Is it all about the number of correctly classified patterns in my test set? Let's say both networks were shown the same test set and network A classified more patterns correctly. Can I say that network A is (in general) better than network B?

Or should I also look at each network's performance as measured by my performance function?

Are there any other measures for comparing two networks trained with different parameters?
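
For concreteness, a minimal sketch of the comparison I have in mind (net1, net2, xTest and tTest are placeholder names for my two trained networks and my held-out test inputs/targets):

    % Evaluate both networks on the same test set
    y1 = net1(xTest);                      % outputs of network A
    y2 = net2(xTest);                      % outputs of network B

    % confusion returns the fraction of misclassified patterns
    err1 = confusion(tTest, y1);
    err2 = confusion(tTest, y2);
    fprintf('Accuracy A: %.2f%%, Accuracy B: %.2f%%\n', ...
            100*(1 - err1), 100*(1 - err2));

    % perform evaluates each net's own performance function (net.performFcn)
    perf1 = perform(net1, tTest, y1);
    perf2 = perform(net2, tTest, y2);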

1 Answer

That mainly depends on what your concern is. In most cases, analyzing the predicted labels, i.e. the accuracy of the nets on a held-out test set, is enough to pick the better network, especially when the architectures are shallow. However, there are some secondary issues that become more important when you take a broader view of the networks.

  • For example, in the training phase, adding even one hidden unit to the first hidden layer introduces d new free parameters (weights), where d is the dimension of the input layer, that must be estimated. On the other hand, the more free parameters your model has, the more training data is required to fit a reliable model. So bigger networks are acceptable as long as you have enough data to compensate for the added free parameters. As a rule of thumb, adding free parameters increases the risk of over-fitting, which is a central problem in deep neural networks, and much effort has gone into mitigating it. (You can count these free parameters directly; see the sketch after this list.)
  • Another issue, less important in shallow nets, is the computational cost of the extra hidden nodes. Since we are taking the broader view, it is worth mentioning: the deeper your network goes, the more significant this cost becomes. The computational cost of the training phase is also an important consideration when you use back-propagation to update the parameters.
  • One more thing, seen mainly in deep neural networks, is the memory requirement. As the number of layers or neurons increases, the number of free parameters grows dramatically; deep networks can easily have millions of parameters. Clearly, loading that many parameters places real demands on the hardware.
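
As a quick check on the first and last points, the free parameters of each network can be counted directly (a minimal sketch; net1 and net2 are again placeholder names for the two trained networks):

    % getwb returns all weights and biases of a network as one vector,
    % so its length is the number of free parameters to be estimated
    nA = numel(getwb(net1));
    nB = numel(getwb(net2));
    fprintf('Free parameters: A = %d, B = %d\n', nA, nB);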

Hope it helps.