I am trying to add the values of a ragged tensor to each value of another tensor separately, so that the result is a ragged tensor with one more dimension. But I get a broadcasting error:
Unable to broadcast: dimension size mismatch in dimension=[2]. lengths=[3] dim_size=[1, 1, 1, 1, 1]
So this error is raised instead of broadcasting along the ragged dimension of size 1, as the common TF broadcasting rule says it should.
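For comparison, here is a minimal dense-tensor sketch of the broadcasting rule I am referring to, where a size-1 dimension broadcasts against a larger one without any error:

```python
import tensorflow as tf

a = tf.constant([[1, 2, 3, 4]])      # shape (1, 4)
b = tf.constant([[10], [20], [30]])  # shape (3, 1)

# The size-1 dimensions broadcast against each other: (1, 4) + (3, 1) -> (3, 4)
print((a + b).shape)  # (3, 4)
```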
The simplest example is as follows:
This works (with ragged_rank=1):
import tensorflow as tf
x = tf.ragged.constant(
    [
        [[1, 2, 3, 4], [2, 5, 7, 8]],
        [[3, 12, 8, 9], [0, 0, 1, 1], [4, 4, 9, 7]],
    ],
    ragged_rank=1)
y = tf.constant([10, 20, 30])
x = x[:,:,tf.newaxis,:]
y = y[:,tf.newaxis]
print(x+y)
But this doesn't (with ragged_rank=2):
import tensorflow as tf
x = tf.ragged.constant(
    [
        [[1, 2, 3, 4], [2, 5, 7, 8]],
        [[3, 12, 8, 9], [0, 0, 1, 1], [4, 4, 9, 7]],
    ],
    ragged_rank=2)
y = tf.constant([10, 20, 30])
x = x[:,:,tf.newaxis,:]
y = y[:,tf.newaxis]
print(x+y)
I have to deal with ragged_rank=2 ragged tensors because that is what arrives as input to the map function of my tf.data.Dataset batch pipeline.
I would also appreciate a workaround that reduces ragged_rank to 1 (if that's possible), because the inner dimension is supposed to be uniform (of length 4).
UPD: OK, I managed to downgrade the ragged_rank by recreating the tensor like so:
def downgrade_bbox_batch_ragged_rank(x, inner_len=5):
    # Collapse the inner ragged dimension into a uniform dimension of size inner_len
    v = x.flat_values
    rs = x.row_splits
    return tf.RaggedTensor.from_row_splits(
        values=tf.reshape(v, (-1, inner_len)),
        row_splits=rs)
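Applied to the toy example above (with inner_len=4 instead of the default, since the example's inner lists have length 4), a self-contained sketch of this workaround looks like:

```python
import tensorflow as tf

def downgrade_bbox_batch_ragged_rank(x, inner_len=4):
    # Collapse the inner ragged dimension into a uniform dimension of size inner_len
    return tf.RaggedTensor.from_row_splits(
        values=tf.reshape(x.flat_values, (-1, inner_len)),
        row_splits=x.row_splits)

x = tf.ragged.constant(
    [
        [[1, 2, 3, 4], [2, 5, 7, 8]],
        [[3, 12, 8, 9], [0, 0, 1, 1], [4, 4, 9, 7]],
    ],
    ragged_rank=2)
y = tf.constant([10, 20, 30])

x1 = downgrade_bbox_batch_ragged_rank(x)   # ragged_rank is now 1
z = x1[:, :, tf.newaxis, :] + y[:, tf.newaxis]  # broadcasts without error
print(z.shape)  # (2, None, 3, 4)
```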
After that, adding the new axis just before or within the inner dimensions (of the flat_values) works fine in terms of broadcasting. But adding a new axis before the ragged dimensions still doesn't work. Is this expected behaviour?