One-Hot Encoding in TensorFlow for Batch Training


My training data contains ~1500 distinct string labels (one label per record), and I want to do batch training (load just one batch into memory at a time to update the weights of a neural network). Is there a class in TensorFlow that does one-hot encoding of the labels in each batch? Something like what we can do in sklearn:

onehot_encoder = OneHotEncoder(sparse=False)
onehot_encoder.fit(all_training_labels)  # fit once on the entire label set

And then, in each batch within the TensorFlow session, I could transform the batch labels and feed them in for training:

batch_label = onehot_encoder.transform(batch_training_labels)
sess.run(..., feed_dict={x: ..., y: batch_label})

An example will be appreciated. Thanks.

1 Answer

I think this post is similar to Tensorflow One Hot Encoder?

A short answer, from http://www.tensorflow.org/api_docs/python/tf/one_hot:

indices = [0, 1, 2]
depth = 3
tf.one_hot(indices, depth)  
# output: [3 x 3]
# [[1., 0., 0.],
#  [0., 1., 0.],
#  [0., 0., 1.]]

Just posting it to save your time ;)
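Since your labels are strings, `tf.one_hot` needs integer indices first. One approach is to build the string-to-index mapping once over the whole training set, then encode each batch on the fly. Here is a minimal sketch in plain Python (the label names and `one_hot_batch` helper are made up for illustration); the resulting index list could equally be passed to `tf.one_hot(indices, depth)` inside the graph instead:

```python
# Sketch: map string labels to integer indices once, then one-hot
# encode each batch on the fly. Labels below are a toy stand-in.
training_labels = ["cat", "dog", "bird", "dog", "cat"]
vocab = sorted(set(training_labels))                 # fixed label order
label_to_index = {label: i for i, label in enumerate(vocab)}
depth = len(vocab)                                   # ~1500 in your case

def one_hot_batch(batch_labels):
    """Return one one-hot row (length `depth`) per label in the batch."""
    return [[1.0 if j == label_to_index[label] else 0.0
             for j in range(depth)]
            for label in batch_labels]

batch_y = one_hot_batch(["dog", "bird"])
# batch_y has one row per batch label and can be fed as y in feed_dict
```

The mapping is built once up front (like `fit` in sklearn), so every batch uses the same index for the same label.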