:)
While I was defending my thesis proposal, one of my professors asked why we have to specify the number of iterations in a SOM. He said there should be a convergence criterion that tells us when to stop training.
However, as I understand it, we do not have a target vector, so we cannot minimize a cost in the usual supervised sense.
My questions are: first, why is there a need for MAX_ITERATIONS at all, and second, what assures us that the number of iterations we choose will give the optimal map? :(
P.S. Based on experience, I tried 1000 iterations and 10000 iterations on the color dataset. It seems that 10000 iterations does not give a better visualization than 1000. :(
So, both you and your professor are right: you should specify a hard cap on the number of iterations AND a convergence criterion.
Convergence criterion - While you're right that SOMs are unsupervised and thus don't have target vectors, they can still be seen as minimizing some cost function. In general, most unsupervised machine learning methods try to do something like minimize the unaccounted-for variance or maximize the information gain. For SOMs specifically, I'd use the weight deltas as a criterion: when an additional iteration no longer changes the SOM's weights by more than some threshold, stop iterating.
Iteration cap - Even with a convergence criterion in place, a hard cap is necessary in case the SOM doesn't converge (you don't want it to run forever). If you used my example criterion of weight deltas, there could be a case where the weights keep oscillating between iterations, causing the criterion never to be met.
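To make that concrete, here's a minimal sketch of a training loop that combines both ideas: it stops when the largest weight change in an iteration drops below a tolerance, or when the iteration cap is hit, whichever comes first. This is a toy per-sample NumPy update; the function name, the `tol`/`max_iterations` parameters, the exponential decay schedules, and the Gaussian neighborhood are just illustrative choices, not a reference implementation.

```python
import numpy as np

def train_som(data, weights, max_iterations=10000, tol=1e-6,
              learning_rate=0.5, sigma=1.0):
    """Train a SOM until weight updates fall below `tol` (convergence
    criterion) or `max_iterations` is reached (iteration cap)."""
    rows, cols, dim = weights.shape
    # Grid coordinates of every node, used by the neighborhood function.
    grid = np.array([[i, j] for i in range(rows)
                     for j in range(cols)]).reshape(rows, cols, 2)

    for t in range(max_iterations):
        # Decay the learning rate and neighborhood radius over time.
        lr = learning_rate * np.exp(-t / max_iterations)
        sig = sigma * np.exp(-t / max_iterations)

        old_weights = weights.copy()
        x = data[np.random.randint(len(data))]  # pick a random sample

        # Best matching unit: node whose weight vector is closest to x.
        dists = np.linalg.norm(weights - x, axis=2)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)

        # Gaussian neighborhood around the BMU on the map grid.
        grid_dists = np.linalg.norm(grid - np.array(bmu), axis=2)
        h = np.exp(-(grid_dists ** 2) / (2 * sig ** 2))

        # Pull every node's weights toward x, scaled by the neighborhood.
        weights += lr * h[..., np.newaxis] * (x - weights)

        # Convergence criterion: largest weight change this iteration.
        delta = np.abs(weights - old_weights).max()
        if delta < tol:
            print(f"Converged after {t + 1} iterations (max delta {delta:.2e})")
            break
    else:
        # Iteration cap: the loop finished without the criterion firing.
        print(f"Hit the cap of {max_iterations} iterations without converging")

    return weights
```

One caveat with per-sample updates like this: the delta from a single iteration can be noisy, so in practice it's common to average the deltas over a full pass through the data (or over a window of iterations) before comparing against the threshold.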
Happy SOMing!