My training data contains ~1500 labels (strings, one label per record) and I want to do batch training (load just one batch into memory at a time to update the weights of a neural network). I was wondering whether there is a class in TensorFlow that does one-hot encoding of the labels in each batch, something like what we can do in sklearn:
onehot_encoder = OneHotEncoder(sparse=False)
onehot_encoder.fit(all_training_labels)  # fit on the entire set of training labels
Then, for each batch inside the TensorFlow session, I can transform the batch labels and feed them in for training:
batch_label = onehot_encoder.transform(batch_training_labels)
sess.run(train_op, feed_dict={x: ..., y: batch_label})
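For completeness, here is roughly what I mean with made-up label data (a sketch, assuming a recent sklearn that accepts string categories; variable names are placeholders):

import numpy as np
from sklearn.preprocessing import OneHotEncoder

# Placeholder label data; OneHotEncoder expects a 2-D array of shape (n_samples, 1).
all_training_labels = np.array(["cat", "dog", "bird", "dog"]).reshape(-1, 1)

onehot_encoder = OneHotEncoder(sparse=False)
onehot_encoder.fit(all_training_labels)

# Per batch: transform only that batch's labels before feeding them to the network.
batch_training_labels = np.array(["dog", "cat"]).reshape(-1, 1)
batch_label = onehot_encoder.transform(batch_training_labels)
print(batch_label)  # shape (2, 3): one row per sample, one column per distinct label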
An example would be appreciated. Thanks.
I think this question is similar to this one: Tensorflow One Hot Encoder?
A short answer, from this link: http://www.tensorflow.org/api_docs/python/tf/one_hot
Just posting it here to save you time ;)
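A minimal sketch of how tf.one_hot could be used per batch, assuming the string labels are first mapped to integer ids (the vocabulary and variable names below are placeholders, and it uses the TF 1.x session API to match the question):

import tensorflow as tf

# Placeholder vocabulary: map each distinct label string to an integer id.
all_training_labels = ["cat", "dog", "bird", "dog"]
label_to_id = {label: i for i, label in enumerate(sorted(set(all_training_labels)))}
num_classes = len(label_to_id)

# tf.one_hot turns integer ids into one-hot vectors inside the graph.
label_ids = tf.placeholder(tf.int32, shape=[None])
y = tf.one_hot(label_ids, depth=num_classes)

with tf.Session() as sess:
    batch_training_labels = ["dog", "cat", "dog"]
    batch_ids = [label_to_id[label] for label in batch_training_labels]
    batch_label = sess.run(y, feed_dict={label_ids: batch_ids})
    print(batch_label)  # shape (3, num_classes), dtype float32

With this approach you only fit the label-to-id mapping once on the full label set, and the one-hot conversion itself happens inside the graph for each batch.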