Relation between perceptron accuracy and epoch


Is it possible that the accuracy of a perceptron decreases as I go through the training more times? In this case, I use the same training set several times.


carlosdc

Yes.

This is a commonly studied phenomenon: the accuracy on never-before-seen data (test data) starts to decrease after a certain point, that is, after a certain number of passes through the training data (what you call epochs). This phenomenon is called overfitting and is well understood. You want to stop training early (early stopping) or use regularization.
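
As a rough illustration, here is a minimal early-stopping sketch using scikit-learn's Perceptron and its partial_fit method (one call per epoch). The toy data, the patience value, and the epoch budget are all assumptions made for the example:

    # Minimal early-stopping sketch with scikit-learn's Perceptron.
    # The toy data below is an assumption; substitute your own X, y.
    import numpy as np
    from sklearn.linear_model import Perceptron
    from sklearn.model_selection import train_test_split

    rng = np.random.RandomState(0)
    X = rng.randn(500, 10)                                 # toy features
    y = (X[:, 0] + 0.5 * rng.randn(500) > 0).astype(int)   # toy labels

    X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

    clf = Perceptron()
    best_acc, best_weights = -1.0, None
    patience, bad_epochs = 5, 0                 # assumed patience value
    for epoch in range(100):                    # assumed epoch budget
        clf.partial_fit(X_tr, y_tr, classes=np.unique(y))  # one pass = one epoch
        acc = clf.score(X_val, y_val)           # accuracy on held-out data
        if acc > best_acc:
            best_acc, bad_epochs = acc, 0
            best_weights = (clf.coef_.copy(), clf.intercept_.copy())
        else:
            bad_epochs += 1
        if bad_epochs >= patience:              # validation accuracy has not
            break                               # improved for a while: stop
    clf.coef_, clf.intercept_ = best_weights    # restore the best weights

The point of the sketch is that the model you keep is the one that scored best on held-out data, not the one produced by the final epoch.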

lennon310

Neither the accuracy on the training set nor the accuracy on the test set is stable as the epoch count increases. In fact, experiments show that neither the in-sample error nor the out-of-sample error is even monotonic. For this reason a "pocket" strategy is often applied: unlike early stopping, the pocket algorithm keeps the best solution seen so far "in its pocket" rather than returning the last solution.
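
To make the difference concrete, here is a minimal from-scratch sketch of the pocket algorithm. The labels are assumed to be in {-1, +1}, and the epoch count and learning rate are arbitrary choices for the example:

    # From-scratch sketch of the pocket perceptron: run ordinary perceptron
    # updates, but keep ("pocket") the weight vector with the fewest training
    # errors seen so far. Labels y are assumed to be in {-1, +1}.
    import numpy as np

    def pocket_perceptron(X, y, epochs=50, lr=1.0, seed=0):
        rng = np.random.RandomState(seed)
        w = np.zeros(X.shape[1])
        pocket_w = w.copy()
        pocket_errors = np.sum(np.sign(X @ w) != y)  # errors of pocketed weights
        for _ in range(epochs):
            for i in rng.permutation(len(y)):        # one pass over the data
                if np.sign(X[i] @ w) != y[i]:        # misclassified point
                    w = w + lr * y[i] * X[i]         # standard perceptron update
                    errors = np.sum(np.sign(X @ w) != y)
                    if errors < pocket_errors:       # better than the pocket?
                        pocket_w, pocket_errors = w.copy(), errors
        return pocket_w                              # best weights seen, not last

Because the pocket is only ever replaced by a strictly better weight vector, the returned solution's training error is non-increasing in the number of epochs, even though the raw perceptron weights themselves can get worse from one update to the next.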