What is the default learning rate for TensorFlowDNNRegressor with SGD or Adagrad?


This is probably an easy question, but I just can't find it. But I'm also pretty new to all this, so maybe I'm just blind.

What is the default learning rate when using TensorFlowDNNRegressor with SGD or Adagrad? The default for Adam or Adadelta appears to be 0.001, but I cannot find a default for Adagrad (which is the default optimizer for TensorFlowDNNRegressor), nor for classic SGD.

Thanks!

BEST ANSWER:

AdaGrad adapts the step size component-wise (hence the name), so it is far less sensitive to the base learning rate than plain SGD. Strictly speaking it still has one, though: TensorFlow's tf.train.AdagradOptimizer takes an explicit learning_rate argument, as does GradientDescentOptimizer. A pretty concise write-up: https://xcorr.net/2014/01/23/adagrad-eliminating-learning-rates-in-stochastic-gradient-descent/
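To see why the base learning rate matters so much less, here is a minimal pure-Python sketch of the AdaGrad update rule (an illustration of the algorithm, not TensorFlow's implementation; the function name and the toy quadratic objective are my own):

```python
import math

def adagrad_step(params, grads, accum, lr=0.1, eps=1e-8):
    """One AdaGrad update: each component's step is divided by the
    square root of its own accumulated squared gradients, so strongly
    updated components automatically get smaller steps."""
    for i, g in enumerate(grads):
        accum[i] += g * g
        params[i] -= lr * g / (math.sqrt(accum[i]) + eps)
    return params, accum

# Toy objective f(x) = x0**2 + 10 * x1**2, whose two gradient
# components differ in scale by 10x: grad = [2*x0, 20*x1].
x = [1.0, 1.0]
accum = [0.0, 0.0]
for _ in range(500):
    grads = [2 * x[0], 20 * x[1]]
    x, accum = adagrad_step(x, grads, accum)

# Both components approach the optimum at 0 along (near-)identical
# trajectories despite the 10x difference in gradient scale -- the
# per-component normalization cancels the scale out.
print(x)
```

Note that `lr` still appears in the update; AdaGrad removes the need to tune it per-parameter, not the need to pick a base value at all.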