Removing low-quality tensor predictions from softmax


I want to apply a filter to a tensor and remove values that do not meet my criteria. For example, let's say I have a tensor that looks like this:

softmax_tensor = [[ 0.05 , 0.05, 0.2, 0.7], [ 0.25 , 0.25, 0.3, 0.2 ]]

Right now, the classifier picks the argmax of the tensors to predict:

predictions = [[3],[2]]

But this isn't exactly what I want, because I lose information about the confidence of that prediction. I would rather make no prediction than an incorrect one. So what I would like to do is return filtered tensors like so:

new_softmax_tensor = [[ 0.05 , 0.05, 0.2, 0.7]]
new_predictions    = [[3]]

If this were straight-up python, I'd have no trouble:

new_softmax_tensor = []
new_predictions    = []

for idx,listItem in enumerate(softmax_tensor):
    # get the two highest values and see if they are far enough apart
    M  = max(listItem)
    M2 = max(n for n in listItem if n != M)
    if M - M2 > 0.3: # just making up a criterion here
        new_softmax_tensor.append(listItem) 
        new_predictions.append(predictions[idx])

but given that tensorflow works on tensors, I'm not sure how to do this - and if I did, would it break the computation graph?

A previous SO post suggested using tf.gather_nd, but in that scenario they already had a tensor they wanted to filter on. I've also looked at tf.cond, but I still don't understand how to apply it here. I imagine many other people would benefit from this same solution.

Thanks all.

2 Answers

Accepted answer:

Ok. I've got it sorted out now. Here is a working example.

import tensorflow as tf

# Dummy softmax output, one row of class scores per example
original_softmax_tensor = tf.Variable([
    [0.4, 0.2, 0.2, 0.9, 0.1],
    [0.5, 0.2, 0.2, 0.9, 0.1],
    [0.6, 0.2, 0.2, 0.1, 0.99],
    [0.1, 0.8, 0.2, 0.09, 0.99]
    ], name='original_softmax_tensor')

# Dummy predictions (the argmax of each row above)
original_predictions = tf.Variable([3, 3, 4, 4], name='original_predictions')

# My cutoff: keep a row only if its top two scores differ by more than this
min_diff = tf.constant(0.3)

# There's probably a better way to do this, but I had to do this hack to get
# the difference between the top 2 scores in each row
tmp_diff1, _ = tf.nn.top_k(original_softmax_tensor, k=2, sorted=True)
tmp_diff2, _ = tf.nn.top_k(original_softmax_tensor, k=1, sorted=True)
# Subtracting the top score from both of the top two gives [0, max - second],
# so the row-wise max of the result is the gap between the top two scores
actual_diff = tf.subtract(tmp_diff2, tmp_diff1)
actual_diff = tf.reduce_max(actual_diff, axis=[1])

# Boolean tensor saying which rows to keep
cond_result = actual_diff > min_diff

# Keep only the rows I want
new_predictions    = tf.boolean_mask(original_predictions, cond_result)
new_softmax_tensor = tf.boolean_mask(original_softmax_tensor, cond_result)

# Initialize and run
init_op = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init_op)
    print(sess.run(new_predictions))
    print(sess.run(new_softmax_tensor))
    # return new_predictions / new_softmax_tensor if this is in a function
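
As the comment in the code admits, the double top_k call is a hack. A more direct sketch (same tensors and names as above, not part of the original answer) gets the gap from a single k=2 call by slicing its two columns:

# Sketch of a simpler alternative, assuming the same tensors as above:
# one top_k call, then slice out the two columns and take the difference
top2, _ = tf.nn.top_k(original_softmax_tensor, k=2, sorted=True)
gap = top2[:, 0] - top2[:, 1]                  # top score minus runner-up, per row
cond_result = gap > min_diff                   # rows to keep
new_predictions    = tf.boolean_mask(original_predictions, cond_result)
new_softmax_tensor = tf.boolean_mask(original_softmax_tensor, cond_result)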
Second answer:

Two things I would do to solve your problem:

First, I would fetch the value of the softmax tensor. Look for a reference to it somewhere (keep a reference when you create it, or find it again in the appropriate tensor collection), then evaluate it with sess.run([softmaxtensor,prediction],feed_dict=..). After that you can play with it in plain Python as much as you like.
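
A minimal sketch of that first approach (the 0.3 cutoff is the question's made-up criterion, and `softmax_tensor`/`predictions` are assumed to be the question's ops):

import numpy as np

# Sketch only: evaluate the tensors once, then filter in plain NumPy.
# Assumes `sess` is an open tf.Session and any needed feed_dict is supplied.
scores, preds = sess.run([softmax_tensor, predictions])

top2 = np.sort(scores, axis=1)[:, -2:]   # two highest scores per row (ascending)
keep = (top2[:, 1] - top2[:, 0]) > 0.3   # gap between top two exceeds the cutoff
new_softmax = scores[keep]
new_predictions = preds[keep]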

Second, if you want to stay within the graph, I would use the built-in tf.where(), which works much like the np.where function from the numpy package (doc here).
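
A minimal in-graph sketch of that idea, again assuming `softmax_tensor` and `predictions` are the question's tensors and reusing its made-up 0.3 cutoff; tf.where(condition) returns the indices of the True entries, which tf.gather can then use to select rows:

import tensorflow as tf

# Sketch only: filter rows inside the graph using tf.where + tf.gather
top2, _ = tf.nn.top_k(softmax_tensor, k=2, sorted=True)
gap = top2[:, 0] - top2[:, 1]            # top score minus runner-up, per row
keep_idx = tf.where(gap > 0.3)[:, 0]     # indices of rows passing the cutoff
new_softmax_tensor = tf.gather(softmax_tensor, keep_idx)
new_predictions = tf.gather(predictions, keep_idx)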