Mxnet MNIST training example returns almost constant rmse


I made a small modification to the fit.py MXNet Python example file (image-classification) by adding the RMSE metric:

# evaluation metrics
eval_metrics = ['accuracy']
eval_metrics.append('rmse')

Running the MNIST training example then shows that the RMSE stays at about 5.2 the whole way through, while the accuracy climbs to around 99%.

Should we not observe a decreasing RMSE?

Many thanks, AL

BEST ANSWER

Root Mean Square Error (RMSE) is a metric used for regression problems. In a regression problem the network predicts a real number, and the quality of that prediction can be measured by the numerical difference between the predicted value and the expected value.
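As an illustration (a minimal NumPy sketch, not the exact computation MXNet's 'rmse' metric performs on the MNIST outputs): in regression the RMSE shrinks as predictions approach the targets, but when the "targets" are digit labels 0-9 and the "predictions" are probabilities in [0, 1], the numerical gap is dominated by the label magnitude and barely changes as accuracy improves, which is plausibly why the reported value hovers around 5.2.

import numpy as np

# Regression: predictions and targets are real numbers on the same scale,
# so the RMSE shrinks as the predictions improve.
targets = np.array([2.0, 3.5, 5.0])
preds = np.array([2.1, 3.4, 5.2])
print(np.sqrt(np.mean((preds - targets) ** 2)))   # small (~0.14)

# Classification: labels are digit indices 0-9 while the network outputs
# probabilities in [0, 1]; the difference stays large no matter how
# confident and correct the classifier becomes.
labels = np.array([7, 2, 9])
p_true = np.array([0.99, 0.98, 0.97])  # probability assigned to the true class
print(np.sqrt(np.mean((p_true - labels) ** 2)))   # stays around 5-6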

In classification, the network assigns labels to the data, and RMSE is not an appropriate measure of the quality of a predicted label, since the prediction is not a real number. Cross entropy is a more appropriate metric for classification problems.
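For intuition, the per-sample cross entropy is just -log of the probability the network assigns to the true class, so it does fall toward 0 as classification improves (a minimal NumPy sketch):

import numpy as np

# Cross entropy on the true class: -log(p_true).  Unlike RMSE on class
# indices, it decreases toward 0 as the model becomes more confident
# and correct.
for p_true in (0.1, 0.5, 0.9, 0.99):
    print(p_true, -np.log(p_true))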

In this case, you can use cross entropy like this, keeping the same pattern as above:

# report accuracy together with cross entropy instead of rmse
eval_metrics = ['accuracy']
eval_metrics.append('ce')
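For reference, the same metrics can also be passed directly to a Module's fit call outside of fit.py; a minimal sketch, assuming an already-built Module named mod and MNIST iterators train_iter/val_iter (hypothetical names, not part of the example script):

import mxnet as mx

# Composite metric: report both accuracy and cross entropy each epoch.
# mod, train_iter and val_iter stand in for an existing Module and data
# iterators; the eval_metric argument is the relevant part here.
metric = mx.metric.create(['accuracy', 'ce'])
mod.fit(train_iter,
        eval_data=val_iter,
        eval_metric=metric,
        num_epoch=10)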